专利摘要:
scanner with focus is revealed a portable scanner to obtain and / or measure the 3d geometry of at least a part of the surface of an object using confocal pattern projection techniques. specific embodiments are provided for intraoral scanning and scanning the inner part of the human ear.
公开号:BR112012000189B1
申请号:R112012000189
申请日:2010-06-17
公开日:2020-01-21
发明作者:A Qazi Arish;Öjelund Henrik;Hollenbeck Karl-Josef;Van Der Poel Mike;Kjaer Rasmus;Fisker Rune
申请人:3Shape As;
IPC主号:
专利说明:

“SCAN APPLIANCE WITH FOCUS.
[0001] The present invention relates to an apparatus and a method for 3D optical scanning of surfaces. The principle of the apparatus and method according to the invention can be applied to various contexts. A specific embodiment of the invention is particularly suitable for intraoral scanning, that is, direct scanning of surrounding teeth and soft tissues in the oral cavity. Other embodiments of the invention related to dentistry are suitable for scanning dental impressions, plaster models, wax molds, prostheses and dental abutments. Another embodiment of the invention is suitable for scanning the inner and outer part of a human ear or ear canal impressions. The invention can find use within the scanning of the 3D structure of the skin in dermatological or cosmetic / cosmetological applications, scanning of jewelery or wax models of whole jewelery or part of jewelery, scanning of industrial parts and even 3D scanning solved in time, such as like 3D scanning solved in time for moving industrial parts.
Background of the Invention [0002] The invention relates to the three-dimensional (3D) scanning of the surface geometry of objects. Scanning an object surface in 3 dimensions is a well-known field of study and the methods for scanning can be divided into contact and non-contact methods. An example of contact measurement methods are Coordinate Measuring Machines (CMM), which measure by letting a tactile probe explore the surface. The advantages include high precision, but the process is slow and a CMM is large and
Petition 870170040087, of 6/12/2017, p. 12/27
2/83 expensive. Non-contact measurement methods include X-rays and optical probes.
[0003] Confocal microscopy is an optical imaging technique used to increase micrograph contrast and / or to reconstruct three-dimensional images using a spatial orifice to eliminate out-of-focus light or flare in samples that are thicker than the focal plane.
[0004] A confocal microscope uses spot lighting and a hole in an optically conjugated plane in front of the detector to eliminate out-of-focus information. Only light within the focal plane can be detected. Since only one point is illuminated at a time in confocal microscopy, 2D images require raster scanning and 3D images require raster scanning in a series of focus planes.
[0005] In WO 00/08415, the principle of confocal microscopy is applied by illuminating the surface with a plurality of illuminated points. By varying the focal plane, specific positions of focal points on the surface can be determined. However, the determination of the surface structure is limited to those parts of the surface that are illuminated by a spot.
[0006] WO 2003/060587 refers to the optical sectioning of a sample under microscopy, in which the sample is illuminated with a lighting pattern. The focus positions of the image plane are determined by the characterization of an oscillatory component of the pattern. However, the focal plane can only be adjusted by moving the sample and the optical system in relation to each other, that is, closer or further away from each other. Thus, the controlled variation of the focal plane requires a spatial relationship
Petition 870170040087, of 6/12/2017, p. 12/28
3/83 controlled between the sample and the optical system, which is performed under a microscope. However, such a controlled spatial relationship is not applicable to, for example, a portable scanner.
[0007] US 2007/0109559 A1 describes a focusing scanner where distances from the focusing lens positions are found in which the maximum reflection intensity of the light beams incident on the object to be scanned is observed. In contrast to the invention described herein, this prior art does not exploit any predetermined measure of the lighting pattern and does not exploit any detection of contrast, and therefore the signal-to-noise ratio is suboptimal.
[0008] In WO 2008/125605, means for generating a pattern of variation in time composed of alternating divided images are described. This document describes a scanning method for obtaining an optical section of a scanning object using two different lighting profiles, for example, two opposite phase patterns. These two images are used to extract the optical section, and the method is limited to the acquisition of images from only two different lighting profiles. In addition, the method is based on a predetermined calibration that determines the phase shift between the two lighting profiles.
Summary of the Invention [0009] Thus, an objective of the invention is to provide a scanner that can be integrated into a manageable compartment, such as a portable compartment. Other objectives of the invention are: to discriminate the information out of focus and to provide a quick scanning time.
Petition 870170040087, of 6/12/2017, p. 12/29
4/83 [0010] This is achieved by a method and by a scanner for obtaining and / or measuring the 3D geometry of at least a part of the surface of an object, said scanner comprising:
- at least one camera that accommodates an array of sensor elements,
- means for generating a probe light that incorporates a spatial pattern,
- means for transmitting the probe light to the object, thus illuminating at least a part of the object with the said pattern in one or more configurations,
- means for transmitting at least a part of the light returned from the object to the camera,
- means for varying the position of the pattern's focus plane on the object, maintaining a fixed spatial relationship of the scanner and the object,
- means for obtaining at least one image of said sensor element array,
- means for evaluating a correlation measure at each focus plane position between at least one image pixel and a weight function, where the weight function is determined based on the information of the spatial pattern configuration;
- means of data processing for:
a) determine, by analyzing the correlation measure, the focus position (s) of:
- each of a plurality of image pixels for a series of focus plane positions, or
- each of a plurality of groups of image pixels for a series of focus plane positions, and
b) transform data into focus into 3D real-world coordinates.
Petition 870170040087, of 6/12/2017, p. 12/30
5/83 [0011] The method and apparatus described in this invention are to provide a 3D surface record of objects using light as a non-contact probing agent. The light is provided in the form of a lighting pattern to provide an oscillation of light on the object. The variation / oscillation in the pattern can be spatial, for example, a static checkerboard pattern, and / or it can vary in time, for example, by displacing a pattern through the object to be scanned. The invention provides a variation of the focus plane of the pattern over a series of focus plane positions, while maintaining a fixed spatial relationship of the scanner and the object. This does not mean that the scan must be provided with a fixed spatial relation of the scanner and the object, only that the plane of focus can be varied (scanned) with a fixed spatial relation of the scanner and the object. This provides a portable scanner solution based on the present invention.
[0012] In some embodiments, the signals from the sensor element array are light intensity.
[0013] One embodiment of the invention comprises a first optical system, such as an array of lenses, for transmitting the probe light to the object and a second optical system for the imaging light returned from the object to the camera. In the preferred embodiment of the invention, only an optical system illustrates the pattern on the object and illustrates the object, or at least part of the object, on the camera, preferably along the same optical axis, however, along optical paths opposites.
[0014] In the preferred embodiment of the invention, an optical system provides an image formation of the pattern on the object to be scanned and the object to be scanned for
Petition 870170040087, of 6/12/2017, p. 12/31
6/83 the camera. Preferably, the focus plane is adjusted in such a way that the image of the pattern on the scanned object is shifted along the optical axis, preferably in equal steps from one end of the scanning region to the other. The probe light incorporating the pattern provides a pattern of light and darkness on the object. Specifically, when the pattern is varied over time for a fixed focus plane, then the regions in focus on the object will exhibit a pattern of light and dark oscillation. Out-of-focus regions will exhibit less or no contrast in light fluctuations.
[0015] Generally, we consider the case in which the light incident on the object is reflected diffusively and / or specularly from the surface of the object. But it is understood that the device and scanning method are not limited to this situation. They are also applicable, for example, to the situation in which the incident light penetrates the surface and is reflected and / or scattered and / or gives rise to fluorescence and / or phosphorescence in the object. The internal surfaces of a sufficiently translucent object can also be illuminated by the lighting pattern and transformed into an image on the camera. In this case, volumetric scanning is possible. Some plastic organisms are examples of such objects.
[0016] When a temporal variation pattern is applied, a single subscan can be obtained by collecting a variety of 2D images in different positions of the focus plane and in different instances of the pattern. As the focus plane coincides with the scanning surface in a single pixel position, the pattern will be projected on the surface point in focus and with high contrast, thus giving rise to a wide variation,
Petition 870170040087, of 6/12/2017, p. 12/28
7/83 or amplitude, of the pixel value over time. For each pixel, it is thus possible to identify individual settings of the focus plane to which each pixel will be in focus. Using the knowledge of the optical system used, it is possible to transform the contrast information as a function of the position of the focus plane into 3D surface information, on an individual pixel basis.
[0017] Thus, in an embodiment of the invention, the focus position is calculated by determining the amplitude of light oscillation for each of a plurality of sensor elements for a range of focus planes.
[0018] For a static pattern, a single subscan can be obtained by collecting a variety of 2D images at different positions in the focus plane. As the focus plane coincides with the scanning surface, the pattern will be projected at the point of focus surface with high contrast. The high contrast gives rise to a great spatial variation of the static pattern on the object's surface, thus providing a great variation, or amplitude, of the pixel values over a group of adjacent pixels. For each group of pixels it is therefore possible to identify individual settings of the focus plane for which each group of pixels will be in focus. Using the knowledge of the optical system used, it is possible to transform the contrast information as a function of the position of the focus plane into 3D surface information, on an individual pixel group basis.
[0019] Thus, in an embodiment of the invention, the focus position is calculated by determining the amplitude of light oscillation for each of a plurality of groups of sensor elements for a range of planes.
Petition 870170040087, of 6/12/2017, p. 12/33
8/83 focus.
[0020] The 2D to 3D conversion of image data can be performed in several ways known in the art. That is, the 3D surface structure of the probed object can be determined by finding the plane corresponding to the maximum light oscillation amplitude for each sensor element, or for each group of sensor elements, in the sensor array of the camera when recording the amplitude of light for a range of different focus planes. Preferably, the focus plane is adjusted in equal steps from one end of the scanning region to the other. Preferably, the focus plane can be moved in a range large enough to at least match the surface of the object being scanned.
[0021] The present invention differs from WO 2008/125605 because, in embodiments of the present invention that use a time-varying pattern, the input images are not limited to two lighting profiles and can be obtained from any profile of standard lighting. This is because the orientation of the reference image does not depend entirely on a predetermined calibration, but on the specific time of the input image acquisition.
[0022] Thus, the document WO 2008/125605 applies specifically to exactly two patterns, which are physically realized by a chrome-on-glass mask as illuminated from either side, the back side being reflective. WO 2008/125605 thus has the advantage of not using any moving parts, but the disadvantage of a relatively weaker signal-to-noise ratio. In the present invention, there is the possibility of using any number of pattern configurations, which makes the
Petition 870170040087, of 6/12/2017, p. 12/34
9/83 calculation of the amplitude of light oscillation or the most accurate correlation measure.
Definitions [0023] Standard: A light signal comprising a spatial structure incorporated in the lateral plane. It can also be called “lighting pattern.
[0024] Pattern of variation over time: A pattern that varies over time, that is, the incorporated spatial structure varies over time. It can also be called “time-varying lighting pattern. Next, also called “fringes.
[0025] Static pattern: A pattern that does not vary over time, for example, a static checkerboard pattern or a static inline pattern.
[0026] Pattern configuration: The state of the pattern. Knowing the pattern configuration at a given time is equivalent to knowing the spatial structure of the lighting at that time. For a periodic pattern, the pattern configuration will include pattern phase information. If a surface element of the object to be scanned is photographed on the camera, then knowledge of the pattern's configuration is equivalent to knowledge of which part of the pattern is illuminating the surface element.
[0027] Focus plane: A surface where the light rays emitted from the pattern converge to form an image on the object to be scanned. The focus plane does not have to be flat. It may be a curved surface.
[0028] Optical system: An arrangement of optical components, for example, lenses, which transmit, collimate and / or reflect light, for example, transmitting probe light to the object, generating an image of the pattern on and / or on the object, and
Petition 870170040087, of 6/12/2017, p. 12/35
10/83 generating image of the object, or at least a part of the object, in the camera.
[0029] Optical axis: An axis defined by the propagation of a beam of light. An optical axis is preferably a straight line. In the preferred embodiment of the invention, the optical axis is defined by the configuration of a plurality of optical components, for example, the configuration of lenses in the optical system. There may be more than one optical axis, if, for example, an optical system transmits probe light to the object and another optical system illustrates the object in the camera. But, preferably, the optical axis is defined by the propagation of light in the optical system that transmits the pattern on the object and forms the image of the object on the camera. The optical axis generally coincides with the longitudinal axis of the scanner.
[0030] Optical path: The path defined by the propagation of light from the light source to the camera. In this way, a part of the optical path preferably coincides with the optical axis. While the optical axis is preferably a straight line, the optical path can be a non-straight line, for example, when light is reflected, scattered, bent, divided and / or provided in a similar way, for example, by means of beam splitters, mirrors, optical fibers and the like.
[0031] Telecentric system: An optical system that provides images in such a way that the main rays are parallel to the optical axis of the said optical system. In a telecentric system, points out of focus have substantially the same magnification as points in focus. This can provide an advantage in data processing. It is difficult to achieve an optical system perfectly
Petition 870170040087, of 6/12/2017, p. 12/36
11/83 telecentric, however, an optical system that is substantially telecentric or quasi-telecentric can be provided by careful optical design. Thus, when referring to a telecentric optical system, it must be understood that it can only be almost telecentric.
[0032] Scan length: A lateral dimension of the field of view. If the probe tip (i.e., the scanning head) comprises a folding optics to direct the probe light in a different direction, such as perpendicular to the optical axis, then the scanning length is the lateral dimension parallel to the axis optical.
[0033] Scanning object: The object to be scanned and on whose surface the scanner provides information. “The scanning object can simply be called an“ object.
[0034] Camera: Image formation sensor comprising a plurality of sensors that respond to light entering the image sensor. The sensors are preferably arranged in a 2D matrix in rows and columns.
[0035] Input signal: Light input signal or sensor input signal from the sensors in the camera. This can be integrated intensity of light incident on the sensor during the exposure time or integration of the sensor. In general, it translates to a pixel value within an image. It can also be called a “sensor signal.
[0036] Reference signal: A signal derived from the standard. A reference signal can also be designated by a weight function or weight vector or reference vector.
[0037] Correlation measure: A measure of the degree of correlation between a reference and input signal. In
Petition 870170040087, of 6/12/2017, p. 37/128
12/83 preference, the correlation measure is defined in such a way that, if the reference and input signal are linearly related to each other, then the correlation measure gets a greater magnitude than if they are not. In some cases, the correlation measure is an amplitude of light oscillation.
[0038] Image: An image can be seen as a 2D matrix of values (when obtained with a digital camera) or, in optics, an image indicates that there is a relationship between a captured surface and an image surface where the rays of light that emerge from a point on said captured surface substantially converge at a point on said image surface.
[0039] Intensity: In optics, intensity is a measure of the power of light per unit area. When recording images with a camera comprising a plurality of individual detection elements, the intensity can be used to designate the light signal recorded on the individual detection elements. In this case, the intensity reflects a time integration of the light power per unit area in the detection element over the exposure time involved in the image recording.
Mathematical notation
A A measure of correlation between the weight function and the recorded light signal. This can be a range of light oscillation.
I Light input signal or sensor input signal.
This can be integrated intensity of light incident on the sensor during the exposure time or integration of the sensor. In general, it translates to a pixel value within an image.
f Reference signal. It can also be called a value
Petition 870170040087, of 6/12/2017, p. 12/38
13/83 by weight.
n The number of measurements with a camera sensor and / or several camera sensors that are used to calculate a correlation measure.
H Image height in number of pixels
W Image width in number of pixels [0040] The symbols are also explained as needed in the text.
Detailed Description of the Invention [0041] The scanner preferably comprises at least one beam splitter located in the optical path. For example, an image of the object can be formed on the camera using a beam splitter. Exemplary uses of beam splitters are illustrated in the figures.
[0042] In a preferred embodiment of the invention, light is transmitted in an optical system which comprises a lens system. This lens system can transmit the pattern to the object and transforms reflected light from the object to the camera into an image.
[0043] In a telecentric optical system, points out of focus have the same magnification as points in focus. Telecentric projection can therefore significantly facilitate the mapping of data from 2D images acquired into 3D images. Thus, in a preferred embodiment of the invention, the optical system is substantially telecentric in the space of the probed object. The optical system can also be telecentric in the standard and camera space. Variable Focus [0044] A point of articulation of the invention is the variation, that is, scanning, of the focal plane without moving the scanner in relation to the object being scanned. Preferably,
Petition 870170040087, of 6/12/2017, p. 12/39
14/83 the focal plane can be varied, as varied periodically, while the means of generating the pattern, the camera, the optical system and the object to be scanned are fixed together. In addition, the acquisition time of the 3D surface must be short enough to reduce the impact of relative movement between the probe and the teeth, for example, to reduce the vibratory effect. In the preferred embodiment of the invention, the focus plane is varied by means of at least one focus element. Preferably, the focus plane is periodically varied with a predefined frequency. This frequency can be
in fur any less 1 Hz, such as at any less 2 Hz, 3, 4, 5, 6, 7, 8, 9 or at least 10 Hz, such as fur any less 20, 40, 60, 80 or fur any less 100 Hz.[0045] Preferably, the element of focus does
part of the optical system. That is, the focus element can be a lens in a lens system. A preferred embodiment comprises means, such as a translation stage, for adjusting and controlling the position of the focus element. In this way, the focus plane can be varied, for example, by translating the focus element back and forth along the optical axis.
[0046] If a focus element is moved back and forth with a frequency of several Hz, this can lead to scanner instability. A preferred embodiment of the invention thus comprises means for reducing and / or eliminating vibration and / or agitation of the focus element adjustment system, thereby increasing the stability of the scanner. This can be provided, at least partially, by means of fixing and / or maintaining the center of mass of the focus element adjustment system, such as
Petition 870170040087, of 6/12/2017, p. 40/128
15/83 a counterweight to substantially counterbalance the movement of the focus element; for example, by translating a counterweight opposite the movement of the focus element. Ease of operation can be achieved if the counterweight and the focus element are connected and operated by the same means of translation. However, this can only substantially reduce vibration for the first order. If a balanced counterweight device is rotated around the balanced counterweight axis, there may be problems related to the torque created by the counterweights. Another embodiment of the invention thus comprises means for reducing and / or eliminating vibration and / or agitation of first order, second order, third order and / or higher order of the focus element adjustment system, thereby increasing the stability of the scanner .
[0047] In another embodiment of the invention, more than one optical element is moved to displace the focal plane. In this embodiment, it is desirable that these elements are moved together and that the elements are physically adjacent.
[0048] In the preferred embodiment of the invention, the optical system is telecentric, or almost telecentric, for all positions of the focus plane. In this way, even if one or more lenses in the optical system can be moved back and forth to change the position of the focus plane, the telecentricity of the optical system is maintained.
[0049] The preferred embodiment of the invention comprises focus gear. The focus gear is the correlation between the movement of the lens and the movement of the position of the focus plane. For example, a focus gear of 2 means that a translation of the focus element
Petition 870170040087, of 6/12/2017, p. 41/128
16/83 of 1 mm corresponds to a translation of the position of the focus plane of 2 mm. The focus gear can be provided by a suitable design of the optical system. The advantage of the focus gear is that a small movement of the focus element can correspond to a wide variation in the position of the focus plane.
focus In achievements specific gives invention , a gear in focus is between 0, 1 and 100, such how in between 0.1 and 1, such as between 1 and 10 , such like between 2 and 8, such as in between 3 and 6, such as at any less 10, such how fur any less 20. [0050]In another concretization gives invention , O
focus element is a liquid lens. A liquid lens can control the focus plane without using moving parts.
Camera [0051] The camera can be a standard digital camera that accommodates a standard CCD or CMOS chip with an A / D converter per line of sensor elements (pixels). However, to increase the frame rate, the scanner according to the invention may comprise a high speed camera that accommodates multiple A / D converters per line of pixels, for example, at least 2, 4, 8 or 16 A / D converters D per line of pixels.
Pattern [0052] Another central element of the invention is the probe light with a built-in pattern that is projected onto the object being scanned. The pattern can be static or vary over time. The time variation pattern can provide a variation of light and darkness over and / or on the object. Specifically, when the pattern is varied in time for a fixed focus plane, then the regions in focus on the object will exhibit an oscillating pattern of light and darkness. The regions
Petition 870170040087, of 6/12/2017, p. 42/128
17/83 out of focus will exhibit less or no contrast in light fluctuations. The static pattern can provide a spatial variation of light and darkness over and / or on the object. Specifically, the regions in focus will exhibit an oscillating pattern of light and darkness in space. Out-of-focus regions will exhibit less or no contrast in spatial light fluctuations.
[0053] The light can be provided from an external light source, however, preferably, the scanner comprises at least one light source and pattern generation means to produce the pattern. It is advantageous in terms of the signal to noise ratio to design a light source so that the intensity in the unmasked parts of the pattern is as close as possible to uniform in space. In another embodiment, the light source and the pattern generation means are integrated into a single component, such as a segmented LED. A segmented LED can provide a static pattern and / or it can provide a pattern of variation in time itself, activating and deactivating the different segments in sequence. In one embodiment of the invention, the pattern of variation over time periodically varies over time. In another embodiment of the invention, the static pattern periodically varies in space.
[0054] The light from the light source (external or internal) can be transmitted through the pattern generation means, thus generating the pattern. For example, the pattern generating means comprises at least one translucent and / or transparent pattern element. To generate a pattern of variation in time, a wheel with an opaque mask can be used. For example, the mask comprises a plurality of radial rays, preferably arranged
Petition 870170040087, of 6/12/2017, p. 43/128
18/83 in a symmetrical order. The scanner may also comprise means for rotating and / or translating the pattern element. To generate a static pattern, a glass plate with an opaque mask can be used. For example, the mask comprises an inline pattern or a checkerboard pattern. In general, said mask preferably has a rotation and / or translation periodicity. The standard element is located in the optical path. In this way, light from the light source can be transmitted through the pattern element, for example, transmitted transversely through the pattern element. The time-varying pattern can then be generated by rotating and / or translating the pattern element. A pattern element that generates a static pattern does not need to be moved during a scan.
Correlation [0055] An objective of the invention is to provide a short scanning time and real-time processing, for example, to provide live feedback to a scanner operator to do a quick scan of an entire dental arch. However, real-time high-resolution 3D scanning creates a huge amount of data. Therefore, data processing must be provided in the scanner compartment, that is, close to the optical components, to reduce the data transfer rate for, for example, a cart, workstation or display. In order to speed up the data processing time and in order to extract information in focus with an optimal signal-to-noise ratio, several correlation techniques can be incorporated / implemented. This can, for example, be implemented in the camera's electronics to discriminate out-of-focus information. The standard is applied to provide
Petition 870170040087, of 6/12/2017, p. 44/128
19/83 lighting with a spatial structure incorporated in the object being scanned. The determination of the information in focus refers to the calculation of a correlation measure of this spatially structured light signal (which we call the input signal) with the variation of the standard itself (which we call the reference signal). In general, the magnitude of the correlation measure is high if the input signal matches the reference signal. If the input signal exhibits little or no variation, then the magnitude of the correlation measure is low. If the input signal exhibits a large spatial variation, but this variation is different from the variation in the reference signal, then the magnitude of the correlation measure is also low. In another embodiment of the invention, the scanner and / or the scanner head can be wireless, thereby simplifying the handling and operation of the scanner and increasing accessibility in difficult scanning situations, for example, intraoral or ear scanning. However, wireless operation can further increase the need for local data processing to prevent wireless transmission of raw 3D data.
[0056] The reference signal is provided by means of pattern generation and can be periodic. The variation in the input signal can be periodic and can be confined to one or a few periods. The reference signal can be determined independently of the input signal. Specifically in the case of a periodic variation, the phase between the oscillating input and the reference signal can be known independently of the input signal. In the case of a periodic variation, the correlation is typically related to the amplitude of the variation. If the phase between the reference signals and the oscillating input is not known, it is
Petition 870170040087, of 6/12/2017, p. 45/128
20/83 it is necessary to determine both the cosine and the sinusoidal part of the input signal before the amplitude of variation of the input signal can be determined. This is not necessary when the phase is known.
[0057] One way to define the correlation measure mathematically with a discrete set of measurements is as a product of points calculated from a signal vector, I = (Ii, In), with elements n> 1 representing sensor signals and a reference vector, f = (fi,.. f n ), of the same length as said reference weight signal vector. Correlation measure A is then given by:
n
A = fl = Y i f i I i i = 1 [0058] The indices in the elements in the signal vector represent sensor signals that are recorded at different times and / or at different sensors. In the case of continuous measurement, the above expression is easily generalized to involve integration instead of the sum. In this case, the integration parameter is time and / or one or more spatial coordinates.
[0059] A preferred embodiment is to remove the CC part of the correlation signal or correlation measure, that is, when the reference vector elements add up to zero (Σ = 1 / ί = o). The focus position can be found as an end of the correlation measure calculated over all positions of the focus element. It is observed that, in this case, the correlation measure is proportional to the sample's Pearson correlation coefficient between two variables. If the DC part is not removed, there may be a trend in the DC signal over all positions of the focus element, and
Petition 870170040087, of 6/12/2017, p. 46/128
21/83 this trend may be dominating numerically. In this situation, the focus position can still be found by analyzing the correlation measure and / or one or more of its derivatives, preferably after removing the trend.
[0060] Preferably, the global extreme must be found. However, artifacts, such as dirt in the optical system, can result in false global maximums. Therefore, it may be advisable to look for extreme locations in some cases. If the object being scanned is sufficiently translucent, it may be possible to identify internal surfaces or surface parts that are otherwise obstructed. In such cases, there may be several local extremes that correspond to surfaces and it may be advantageous to process several or all of the extremes.
[0061] The correlation measure can typically be calculated based on input signals that are available as digital images, that is, images with a finite number of discrete pixels. Therefore, conveniently, calculations for obtaining correlation measures can be performed for image pixels or groups of these. Correlation measures can then be viewed as pseudo-images.
[0062] The correlation measure applied in this invention is inspired by the principle of a blocking amplifier, in which the input signal is multiplied by the reference signal and integrated over a specified period. In this invention, a reference signal is provided by the standard.
Temporal Correlation [0063] The temporal correlation involves a variable pattern over time. The light signal on the detection elements
Petition 870170040087, of 6/12/2017, p. 47/128
22/83 individual light in the camera is recorded several times while the pattern setting is varied. The correlation measure is thus at least calculated with sensor signals recorded at different times.
[0064] A principle for estimating the amplitude of light oscillation in a light signal with periodic variation is taught in WO 98/45745, where the amplitude is calculated first by estimating a cosine and a sinusoidal part of the intensity oscillation of light. However, from a statistical point of view, this is not ideal because two parameters are estimated in order to calculate the amplitude.
[0065] In this embodiment of the invention, independent knowledge of the pattern configuration in each light signal recording allows to calculate the correlation measure in each light detection element.
[0066] In some embodiments of the invention, the scanner comprises means for obtaining knowledge of the pattern configuration. To provide such knowledge, the scanner preferably also comprises means for recording and / or monitoring the pattern of variation over time.
[0067] Each individual light detection element, that is, a sensor element, in the camera sees a variation in the light signal corresponding to the variation of light that illuminates the object.
[0068] One embodiment of the invention obtains the time variation of the pattern by translation and / or rotation of the pattern element. In this case, the pattern configuration can be achieved by using a position encoder on the pattern element combined with prior knowledge of the pattern geometry that gives rise to a pattern variation between individual detection elements. Knowledge of
Petition 870170040087, of 6/12/2017, p. 12/28
23/83 pattern configuration thus appears as a combination of knowledge of the pattern geometry that results in a variation between different elements of detection and registration and / or pattern monitoring during 3D scanning. In the case of a rotating wheel as the pattern element, the angular position of the wheel can then be obtained by an encoder, for example, mounted on the rim.
[0069] One embodiment of the invention involves a pattern that has translation and / or rotation periodicity. In this embodiment, there is a well-defined pattern oscillation period if the pattern is substantially translated and / or rotated at a constant speed.
[0070] One embodiment of the invention comprises means for sampling each of a plurality of sensor elements a plurality of times during a period of pattern oscillation, preferably sampled an integer number of times, such as sampling 2, 3, 4 , 5, 6, 7 or 8 times during each pattern oscillation period, thus determining the variation of light over a period.
[0071] The measure of temporal correlation between the variation of light and the pattern can be obtained by recording several images in the camera during a period of oscillation (or at least one period of oscillation). The number of images recorded during a period of oscillation is represented by n. The registration of the pattern position for each individual image combined with the pattern variation known independently over the entire detection element (ie, obtaining knowledge of the pattern configuration) and the recorded images allows an efficient extraction of the correlation measure in each individual detection element in the camera. For a light detection element with label j, the n signals
Petition 870170040087, of 6/12/2017, p. 49/128
24/83 of light recorded from this element are indicated Ii, j,
The correlation measure of this element, Al, can be expressed as
Ai [0072]
Here, the reference signal or weight function f is obtained from knowledge of the pattern configuration, f has two indices i, j. The variation of f with the first index is derived from knowing the pattern position during each image recording. The variation of f with the second index is derived from the knowledge of the pattern geometry that can be determined before the 3D scan.
[0073] Preferably, but not necessarily, the reference signal f tends to zero over time, that is, for all j, we have
i = l to suppress the CC part of the light variation or correlation measure. The focus position corresponding to the pattern in focus on the object for a single sensor element in the camera will be given by an extreme value of the correlation measure for that sensor element when the focus position is varied over a range of values. The focus position can be varied in equal steps from one end of the scanning region to the other.
[0074] To obtain a clear image of an object through a camera, the object must be in focus and the optics of the camera and the object must be in a fixed spatial relationship during the exposure time of the image sensor of the
Petition 870170040087, of 6/12/2017, p. 50/128
25/83 camera. Applied to the present invention, this should imply that the pattern and focus must be varied in discrete steps in order to be able to fix the pattern and focus for each image sampled on the camera, that is, fixed during the exposure time of the sensor array. However, to increase the sensitivity of the image data, the exposure time of the sensor array must be as high as the frame rate of the sensor allows. Thus, in the preferred embodiment of the invention, images are recorded (sampled) on the camera while the pattern is continuously variable (for example, by continuously rotating a pattern wheel) and the focus plane is moved continuously. This implies that the individual images will be slightly blurred as they are the result of a time integration of the image while the pattern is variable and the focus plane is moved. This is something that could be expected to lead to a deterioration in data quality, but in practice, the advantage of concurrent variation in the pattern and focus plane outweighs the disadvantage.
[0075] In another embodiment of the invention, images are recorded (sampled) on the camera while the pattern is fixed and the focus plane is moved continuously, that is, without movement of the pattern. This could be the case when the light source is a segmented light source, such as a segment LED that flashes appropriately. In this embodiment, the knowledge of the pattern is obtained by a combination of prior knowledge of the geometry of the individual segments in the segmented LED that cause a variation between the light sensing elements and the current applied to different LED segments in each recording.
[0076] In yet another embodiment of the invention, the
Petition 870170040087, of 6/12/2017, p. 51/128
26/83 images are recorded (sampled) on the camera while the pattern is continuously variable and the focus plane is fixed.
[0077] In yet another embodiment of the invention, images are recorded (sampled) on the camera while the pattern and focus plane are fixed.
[0078] The principle of temporal correlation can be applied, in general, within image analysis. Thus, another embodiment of the invention relates to a method for calculating the amplitude of an oscillation of light intensity in at least one light-sensitive element (photoelectric), said oscillation of light intensity generated by a lighting pattern that it varies periodically and said amplitude is calculated in at least one period of pattern oscillation, said method comprising the steps of:
- provide the following a predetermined number of sampling times during a pattern oscillation period:
• sampling of the light-sensitive element thus providing the signal for said light-sensitive element, and • providing an angular position and / or a phase of the lighting pattern that varies periodically for said sampling, and
- calculating said amplitude (s) by integrating the products of a predetermined periodic function and the signal of the light-sensitive element corresponding to said predetermined number of sampling times, wherein said periodic function is a function of the angular position and / or the phase of the lighting pattern varying periodically.
[0079] This can also be expressed as
Petition 870170040087, of 6/12/2017, p. 52/128
27/83
where A is the calculated correlation or amplitude measure, i is the index for each sampling, f is the periodic function, pi is the angular phase / position of the lighting pattern for sampling 1 and li is the signal from the light sensitive element for sampling 1. Preferably, the periodic function tends to zero during a period of pattern oscillation, that is,
Σ / (α) = ο · [0080] To generalize the principle to a plurality of light sensitive elements, for example, in a sensor array, the angular phase / position of the lighting pattern for a specific light sensitive element can consist of an angular phase / position associated with the lighting pattern plus a constant deviation associated with the specific light-sensitive element. Thus, the measure of correlation or amplitude of light oscillation in the light sensitive element j can be expressed as i
where 6j is the constant displacement for the light sensitive element j.
[0081] A lighting pattern that varies periodically can be generated by a rotating wheel with an opaque mask comprising a plurality of radial rays arranged in symmetrical order. The angular position of the wheel will thus correspond to the angular position of the pattern and this angular position can be obtained by an encoder mounted on the rim of the wheel. The pattern variation between different sensor elements for different positions of the pattern can be determined prior to 3D scanning in a calibration routine. A combination of knowledge of this variation in
Petition 870170040087, of 6/12/2017, p. 53/128
28/83 pattern and pattern position constitutes knowledge of the pattern configuration. A period of this pattern can, for example, be the time between two rays and the amplitude of a single or a plurality of light sensitive elements of this period can be calculated by sampling, for example, four times in this period.
[0082] A lighting pattern that varies periodically can be generated by a Ronchi regulation that moves orthogonally to the lines and the position is measured by an encoder. This position corresponds to the angular position of the generated pattern. Alternatively, a checkerboard pattern could be used.
[0083] A lighting pattern that varies periodically can be generated by a one-dimensional array of LEDs that can be controlled in a linear fashion.
[0084] A variable lighting pattern can be generated by an LCD or DLP-based projector.
Optical Correlation [0085] The aforementioned correlation principle (temporal correlation) requires some kind of recording of the time-varying pattern, for example, knowledge of the pattern setting on each recording of the light level on the camera. However, a correlation principle without this record can be provided in another embodiment of the invention. This principle is called “optical correlation.
[0086] In this embodiment of the invention, an image of the pattern itself and an image of at least part of the object to be scanned with the pattern projected on it are combined on the camera, that is, the image on the camera is an overlay of the pattern itself and the object being scanned with the pattern projected on it. A different way
Petition 870170040087, of 6/12/2017, p. 54/128
29/83 to express this is that the image on the camera is substantially a multiplication of an image of the pattern projected onto the object with the pattern itself.
[0087] This can be provided as follows. In another embodiment of the invention, the pattern generating means comprises a transparent pattern element with an opaque mask. The probe light is transmitted through the pattern element, preferably transmitted transversely through the pattern element. The light returned from the object to be scanned is retransmitted in the opposite way through the said pattern element and transformed into an image on the camera. This is preferably done in a way that the image of the pattern that illuminates the object and the image of the pattern itself coincide when both are captured on the camera. A particular example of a pattern is a rotating wheel with an opaque mask that comprises a plurality of radial spokes arranged in symmetrical order, such that the pattern has rotational periodicity. In this embodiment, there is a well-defined pattern oscillation period if the pattern is substantially rotated at a constant speed. We defined the oscillation period as 2.71IGG}.
[0088] We observe that, in the described embodiment of the invention, the lighting pattern is a pattern of light and darkness. A light sensing element in the camera with a signal proportional to the intensity of the integrated light during the time of integration of the camera õt with the label j, h is given by t + 5t Ij = KJ Tj (t ') Sj (t ^) dt 't [0089] Here, K is the proportionality constant of the
Petition 870170040087, of 6/12/2017, p. 55/128
30/83 sensor signal, t is the start of the camera integration time, Tj is the time-varying transmission of the part of the rotating pattern element transformed into the image of the jth light detection element, and Sj is the light intensity that varies in time with the light returned from the scanned object and transformed into an image in the twentieth light detection element. In the described embodiment, Tj is the step function substantially defined by Tj (t) = 0 for sin (ωt + φj)> 0 and Tj (t) = 1 elsewhere. φj is a phase dependent on the position of the jth image sensor.
[0090] The signal in the light detection element is a measure of the correlation of the pattern and the light returned from the object to be scanned. The transmission with temporal variation assumes the role of the reference signal and the light intensity that varies in time from the light returned from the scanned object assumes the function of the input signal. The advantage of this embodiment of the invention is that a normal CCD or CMOS camera with intensity detection elements can be used to directly record the correlation measurement, since it appears as an intensity in the detection elements. Another way of expressing this is that the calculation of the correlation measure occurs in the analog optical domain instead of in an electronic domain, such as an FPGA or a PC.
[0091] The focus position corresponding to the pattern that is in focus on the object to be scanned for a single sensor element in the camera will then be given by the maximum value of the correlation measure recorded with that sensor element when the focus position is varied in a range of values. The focus position can be varied in equal steps from one end of the scanning region to the other. One embodiment of the invention comprises means for
Petition 870170040087, of 6/12/2017, p. 56/128
31/83 record and / or integrate and / or monitor and / or store each of a plurality of the sensor elements in a series of focus plane positions.
[0092] Preferably, the global maximum should be found. However, artifacts, such as dirt in the optical system, can result in false global maximums. Therefore, it may be advisable to look for local maximums in some cases.
[0093] Since the reference signal does not tend to zero, the correlation measure has a DC component. Since the CC part is not removed, there may be a trend in the CC signal over all positions of the focus element, and this trend may be numerically dominant. In this situation, the focus position can still be found by analyzing the correlation measure and / or one or more of its derivatives.
[0094] In a further embodiment of the invention, the camera integration time is an integer M of the pattern oscillation period, that is, õt = 2nM / m. An advantage of this embodiment is that the magnitude of the correlation measure can be measured with a better noise ratio in the presence of noise than if the camera integration time is not an integer of the pattern's oscillation period.
[0095] In another further embodiment of the invention, the time of integration of the camera is much longer than the period of oscillation of the pattern, that is, õt »2nM / m. Often, the oscillation time of the pattern here would mean, for example, camera integration time at least 10 times the oscillation time or, more preferably, such as at least 100 or 1000 times the oscillation time. An
Petition 870170040087, of 6/12/2017, p. 57/128
The advantage of this embodiment is that there is no need to synchronize the camera integration time and the pattern oscillation time since, for very long camera integration times compared to the pattern oscillation time, the measurement of The recorded correlation is substantially independent of accurate synchronization.
[0096] Equivalent to the principle of temporal correlation, the principle of optical correlation can be applied, in general, in image analysis. Thus, another embodiment of the invention relates to a method for calculating the amplitude of an oscillation of light intensity in at least one light sensitive element (photoelectric), said oscillation of light intensity being generated by an overlap of a variable lighting pattern with itself, and said amplitude calculated by time integration of the signal of at least one light sensitive element over a plurality of pattern oscillation periods.
Spatial Correlation [0097] The above mentioned correlation principles (temporal correlation and optical correlation) require that the pattern be variable over time. If the optical system and the camera provide a lateral resolution that is at least twice what is necessary for scanning the object, then it is possible to scan with a static pattern, that is, a pattern that does not change over time. This principle is called “spatial correlation. The correlation measure is thus at least calculated with sensor signals recorded at different sensor locations.
[0098] The lateral resolution of an optical system must be understood as the capacity of optical elements in the optical system, for example, a lens system, in
Petition 870170040087, of 6/12/2017, p. 12/58
33/83 transforming spatial frequencies in the object being scanned up to a certain point. The modulation transfer curves of the optical system are typically used to describe the formation of spatial frequencies in an optical system. For example, the resolution of the optical system can be defined as the spatial frequency in the object being scanned, in which the modulation transfer curve has decreased to, for example, 50%. Camera resolution is a combined effect of the spacing of the individual elements of the camera sensor and the resolution of the optical system.
[0099] In spatial correlation, the correlation measure refers to a correlation between input signal and reference signal that occurs in space instead of time. Thus, in one embodiment of the invention, the resolution of the measured 3D geometry is equal to the resolution of the camera. However, for spatial correlation, the resolution of the measured 3D geometry is less than the camera resolution, such as at least 2 times lower, such as at least 3 times lower, such as at least 4 times lower, such as at least 5 times smaller, such as at least 10 times smaller. The array of sensor elements is preferably divided into groups of sensor elements, preferably rectangular groups, such as square groups of sensor elements, preferably adjacent sensor elements. The scanning resolution, that is, the measured 3D geometry, will then be determined by the size of these groups of sensor elements. The oscillation in the light signal is provided within these groups of sensor elements, and the amplitude of the light oscillation can then be obtained by analyzing the groups of sensor elements. The division of
Petition 870170040087, of 6/12/2017, p. 59/128
34/83 matrix of sensor elements in groups is preferably provided in the data processing phase, that is, the division is not a physical division, possibly requiring a specially adapted sensor matrix. Thus, the division into groups is “virtual even if the only pixel in a group is a real physical pixel.
[0100] In one embodiment of the invention, the pattern has a translation periodicity over at least one spatial coordinate. In another embodiment of the invention, the spatially periodic pattern is aligned with the rows and / or columns of the array of sensor elements. For example, in the case of a static line pattern, the lines or columns of pixels on the camera can be parallel to the lines of the pattern. Or, in the case of a static checkerboard pattern, the line and columns of the chessboard may be aligned with the lines and columns, respectively, of the camera's pixels. By alignment, it is understood that the image of the pattern on the camera is aligned with the “pattern of the sensor element in the sensor matrix of the camera. Thus, a given physical location and orientation of the pattern generation medium and the camera require a certain configuration of the scanner's optical components in order for the pattern to be aligned with the camera's sensor array.
[0101] In another embodiment of the invention, at least one spatial period of the pattern corresponds to a group of sensor elements. In another embodiment of the invention, all groups of sensor elements contain the same number of elements and have the same shape. For example, when the period of a checkerboard pattern corresponds to a square group of, for example, 2x2, 3x3, 4x4, 5x5, 6x6, 7x7, 8x8, 9x9, 10x10 or more pixels on the camera.
Petition 870170040087, of 6/12/2017, p. 60/128
35/83 [0102] In yet another embodiment, one or more edges of the pattern are aligned with and / or coincide with one or more edges of the sensor element array. For example, a checkerboard pattern may be aligned with the pixels of the camera in such a way that the edges of the image of the checkerboard pattern on the camera coincide with the edges of the pixels.
[0103] In spatial correlation, independent knowledge of the pattern configuration allows calculating the correlation measure in each light detection group. For spatially periodic illumination, this correlation measure can be calculated without having to estimate the cosine and the sinusoidal part of the light intensity oscillation. Knowledge of the pattern configuration can be obtained prior to 3D scanning.
[0104] In another embodiment of the invention, the correlation measure A Jr within a group of sensor elements with tag j, is determined using the following formula:
ni = l where n is the number of sensor elements in a group of sensors, fj = (fi, j,.. fn, j) is the reference signal vector obtained from knowledge of the pattern configuration and Ij = (li, j,.. In, j) is an input signal vector. For the case of sensors grouped in square regions with N sensors as square length, then n = N 2 .
[0105] Preferably, but not necessarily, the elements of the reference signal vector tend to zero in relation to space, that is, for all j, we have
i = l
Petition 870170040087, of 6/12/2017, p. 61/128
36/83 to remove the CC part of the correlation measure. The focus position corresponding to the pattern in focus on the object for a single group of sensor elements in the camera will be given by an extreme value of the correlation measure for that group of sensor elements when the focus position is varied over a range of values. The focus position can be varied in equal steps from one end of the scanning region to the other.
[0106] In the case of a static checkerboard pattern with edges aligned with the pixels of the camera and with groups of pixels having an even number of pixels, such as 2x2, 4x4, 6x6, 8x8, 10x10, a natural choice of the reference vector f would be that of its elements assuming the value 1 for the pixels that transform a shiny square of the chessboard into an image and -1 for the pixels that transform a dark square of the chessboard into an image.
[0107] Equivalent to the other correlation principles, the principle of spatial correlation can be applied, in general, in image analysis. In particular, in a situation where the camera's resolution is higher than necessary in the final image. Thus, another embodiment of the invention relates to a method for calculating the amplitude (s) of a light intensity oscillation in at least one group of light sensitive elements, said light intensity oscillation being generated by a pattern of static lighting varying spatially, the referred method comprising the steps of:
- provide the signal for each light-sensitive element in said group of light-sensitive elements, and
- calculate said amplitude (s) by integrating the products of a predetermined function and the sign of the element
Petition 870170040087, of 6/12/2017, p. 62/128
37/83 corresponding to the corresponding light-sensitive group of light-sensitive elements, wherein said predetermined function is a function that reflects the lighting pattern.
[0108] To generalize the principle to a plurality of light sensitive elements, for example, in a sensor array, the measure of correlation or amplitude of the light oscillation in group j can be expressed as λ, = É / ('uK .
Í = 1 where n is the number of sensor elements in the group j, Ii, j is the sign of the J'best sensor element in the group j and f (i, j) is a predetermined function that reflects the pattern.
[0109] In comparison with the temporal correlation, the spatial correlation has the advantage that no movement pattern is necessary. This implies that knowledge of the pattern configuration can be obtained prior to 3D scanning. On the other hand, the advantage of the temporal correlation is its higher resolution, since no pixel grouping is necessary.
[0110] All correlation principles, when incorporated with an image sensor that allows very high frame rates, allow 3D scanning of moving objects with little motion blur. It is also possible to track moving objects over time (4D scanning), with useful applications, for example, in machine vision and dynamic strain measurement. Very high frame rates in this context are at least 500, but preferably at least 2000 images per second.
Transformation of extremes of the correlation measure into 3D world coordinates
Petition 870170040087, of 6/12/2017, p. 63/128
38/83 [0111] The ratio of the identified focus position (s) to the camera sensor or sensor groups
of camera in coordinates in world 3D can be done per traced in rays through of system optical. Before such traced in rays be likely to to be performed, the parameters of
optical system need to be known. One embodiment of the invention comprises a calibration step to obtain such knowledge. Another embodiment of the invention comprises a calibration step in which images of an object of known geometry are recorded for a plurality of focusing positions. This object can be a flat checkerboard pattern. Then, the scanner can be calibrated by generating images traced by simulated rays of the calibration object and then adjusting the parameters of the optical system in order to minimize the difference between the simulated and recorded images.
[0112] In another embodiment of the invention, the calibration step requires recording images for a plurality of focus positions for several different calibration objects and / or several different orientations and / or positions of a calibration object.
[0113] With the knowledge of the parameters of the optical system, one can use the technique of tracing back rays to estimate 2D -> 3D mapping. This requires the scanner's optical system to be known, preferably, through calibration. The following steps can be performed:
1. From each pixel of the image (on the image sensor), trace a certain number of rays, starting from the image sensor and through the optical system (ray tracing backwards).
Petition 870170040087, of 6/12/2017, p. 64/128
39/83
2. From the rays they emit, calculate the point of focus, the point where all of these rays substantially intersect. This point represents the 3D coordinate of where a 2D pixel will be in focus, that is, the yield of the global maximum amplitude of light oscillation.
3. Generate a lookup table for all pixels with their corresponding 3D coordinates.
[0114] The above steps are repeated for a number of different focus lens positions covering the scanner's operating range.
Specular Reflections [0115] High spatial contrast of the pattern image in focus on the object is generally necessary to obtain a good signal-to-noise ratio of the correlation measure in the camera. This, in turn, may be necessary to obtain a good estimate of the focus position corresponding to an extreme in the correlation measure. This sufficient signal-to-noise ratio for successful scanning is generally easily achieved on objects with a diffuse surface and negligible light penetration. However, for some objects, it is difficult to achieve high spatial contrast.
[0116] A difficult object type, for example, is an object that has multiple scattering of incident light with a long light diffusion length compared to the smaller characteristic dimension of the spatial pattern represented on the object. A human tooth is an example of such an object. The human ear and the ear canal are other examples. In the case of intraoral scanning, scanning should preferably be provided without spraying and / or drying the teeth to reduce specular reflections and light penetration. The improved spatial contrast can
Petition 870170040087, of 6/12/2017, p. 65/128
40/83 be achieved by forming the preferential image of the specular reflection of the object's surface in the camera. Accordingly, an embodiment of the invention comprises means for preferential / selective image formation of specular reflected light and / or diffused reflected light. This can be provided if the scanner further comprises means for polarizing the probe light, for example, by means of at least one polarizing beam splitter. A polarizing beam splitter can, for example, be provided to form an image of the object on the camera. This can be used to extinguish specular reflections, because if the incident light is linearly polarized, a specular reflection of the object has the property of preserving its state of polarization.
[0117] The scanner according to the invention may further comprise means for changing the polarization state of the probe light and / or the reflected light of the object. This can be provided by means of a delay plate, preferably located in the optical path. In one embodiment of the invention, the delay plate is a quarter wave delay plate. A linearly polarized light wave is transformed into a circularly polarized light wave when passing a quarter-wave plate with a 45-degree orientation of its rapid axis to the direction of linear polarization. This can be used to increase specular reflections because a specular reflection of the object has the property of reversing the helicality of a circularly polarized light wave, while the light that is reflected by one or more dispersion events becomes depolarized.
The Field of View (Scanning Length) [0118] In one embodiment of the invention, the
Petition 870170040087, of 6/12/2017, p. 66/128
41/83 probe is transmitted to the object in a direction substantially parallel to the optical axis. However, for the scanning head to be inserted into a small space, such as a patient's oral cavity, the tip of the scanning head must be small enough. At the same time, the light coming out of the scanning head must leave the scanning head in a direction other than the optical axis. Accordingly, a further embodiment of the invention comprises means for directing the probe light and / or forming the image of an object in a direction other than the optical axis. This can be provided by means of at least one folding element, preferably located along the optical axis, to direct the probe light and / or image an object in a direction other than the optical axis. The folding element can be a light-reflecting element, such as a mirror or a prism. In one embodiment of the invention, a 45 degree mirror is used as a folding optic to direct the light path over the object. In this way, the probe light is guided in a direction perpendicular to the optical axis. In this embodiment, the height of the scanning tip is at least as large as the scanning length and, preferably, approximately equal in size.
[0119] One embodiment of the invention comprises at least two light sources, such as light sources with different wavelengths and / or different polarization. Preferably, also control means for controlling said at least two light sources. Preferably, this embodiment comprises means for combining and / or fusing light from said at least two light sources. In
Petition 870170040087, of 6/12/2017, p. 67/128
42/83 preference, also means to separate light from said at least two light sources. If waveguide light sources are used, they can be fused by waveguides. However, one or more diffusers can also be provided to fuse light sources.
[0120] Separation and / or fusion can be provided by at least one optical device that is partially light transmitting and partially light reflecting, said optical device preferably located along the optical axis, an optical device, such as a mirror coated or coated board. One embodiment comprises at least two of said optical devices, said optical devices being preferably displaced along the optical axis. Preferably, at least one of said optical devices transmits light at certain wavelengths and / or polarizations and reflects light at other wavelengths and / or polarizations.
[0121] An exemplary embodiment of the invention comprises at least a first and a second light source, said light sources having different wavelengths and / or polarization and wherein: a first optical device reflects light from said first light source in a different direction from the optical axis and transmits the light from said second light source, and a second optical device reflects light from said second light source in a different direction from the optical axis. Preferably, said first and second optical devices reflect the probe light in parallel directions, preferably in a direction perpendicular to the optical axis, thus forming images of different parts of the object's surface. These
Petition 870170040087, of 6/12/2017, p. 68/128
43/83 different parts of the object's surface can be at least partially overlapping.
[0122] In this way, for example, light from a first and a second light source that emits light of different wavelengths (and / or polarizations) is fused together using a suitably coated plate that transmits light from the first source of light and reflects the light from the second light source. At the scanning tip along the optical axis, a first optical device (for example, a properly coated plate, a dichroic filter) reflects the light from the first light source on the object and transmits the light from the second light source to a second optical device (for example, a mirror) at the end of the scanning tip, that is, further down the optical axis. During scanning, the focus position is shifted in such a way that the light from the first light source is used to project an image of the pattern to a position below the first optical device while the second light source is turned off. The 3D surface of the object in the region below the first optical device is recorded. Then the first light source is turned off and the second light source is turned on and the focus position is shifted in such a way that the light from the second light source is used to project an image of the pattern to a position below the second optical device. The 3D surface of the object in the region below the second optical device is recorded. The region covered with light from the two light sources, respectively, may partially overlap.
[0123] In another embodiment of the invention, the probe light is directed in a different direction from the optical axis by means of a curved bend mirror. This achievement
Petition 870170040087, of 6/12/2017, p. 69/128
44/83 can comprise one or more optical elements, such as lenses, with surfaces that can be aspherical to provide corrected optical images.
[0124] Another embodiment of the invention comprises at least one translation stage for translating the mirror (s) along the optical axis. This allows for a scanning tip with a height less than the scanning length. A long scanning length can be achieved by combining multiple scans with the mirror (s) in different positions along the optical axis.
[0125] In another embodiment of the invention, the probe light is directed in a different direction from the optical axis by means of at least one grid that provides anamorphic magnification so that the image of the pattern on the object to be scanned is stretched. The grid can be opened. In this embodiment, the light source needs to be monochromatic or semi-monochromatic.
[0126] The above-mentioned embodiments suitable for increasing the scanning length may comprise control means to provide coordination of the light sources and the focus element.
Color Scanning [0127] One embodiment of the invention is only to register the surface topology (geometry) of the object to be scanned. However, another embodiment of the invention is being adapted to obtain the color of the surface being scanned, that is, capable of registering the color of the individual surface elements of the object being scanned in conjunction with the surface topology of the object being scanned. To obtain color information, the light source must be white or comprise at least three light sources
Petition 870170040087, of 6/12/2017, p. 70/128
45/83 monochromatic with colors distributed across the visible part of the electromagnetic spectrum.
[0128] To provide color information, the array of sensor elements can be a color image sensor. The image sensor can accommodate a Bayer color filter scheme. However, other types of color image sensors can be provided, such as a color image sensor of the Foveon type, in which the image sensor provides color registration in each sensor element.
[0129] One embodiment of the invention comprises means that select one color of the probe light at a time, that is, selective switching between different colors of the probe light, thus illuminating the object with different colors. If a white light source is used, then some type of color filter must be provided. Preferably, it comprises a plurality of color filters, such as red, green and blue color filters, and means for inserting said color filters individually in front of the white light source, thereby selecting a color from the probe light.
[0130] In one embodiment of the invention, color filters are integrated into the pattern generating means, that is, the pattern generating means comprise color filters, such as translucent and / or transparent parts that are substantially colored in a monochrome. For example, a pattern element, such as a rotating wheel with an opaque mask and where the translucent / transparent parts are color filters. For example, one third of the wheel is red, one third is green and one third is blue.
[0131] The probe light of different colors can also
Petition 870170040087, of 6/12/2017, p. 71/128
46/83 be provided by at least three monochromatic light sources, such as lasers or LEDs, said light sources having wavelengths distributed across the visible part of the wavelength spectrum. This will, in general, also require means for the fusion of said light sources. For example, suitable coated plates. In the case of waveguide light sources, the fusion can be provided by a waveguide element.
[0132] To treat the different colors of the probe light, the optical system is preferably substantially achromatic.
[0133] One embodiment of the invention comprises means for switching between at least two colors, preferably three colors, such as red, green and blue, from the probe light to a focal plane position, that is, to a plane position single focal, it is possible to switch between different colors of the probe light. For example, turning different monochrome light sources on and off (having a single light source turned on at a time) or applying different color filters. In addition, the amplitude of the light signal of each of a plurality of sensor elements can be determined for each color for each position of the focal plane, i.e., for each focus position, the color of the probe light can be switched. The built-in time variation pattern provides a single color oscillating light signal and the signal amplitude at each sensor element can be determined for that color. When switching to the next color, the range can be determined again. When the range has been determined for all colors, the focus position is changed and the process is repeated. The color of the surface being scanned can then
Petition 870170040087, of 6/12/2017, p. 72/128
47/83 be obtained by combining and / or weighing the color information of a plurality of the sensor elements. For example, the color expressed as, for example, an RGB color coordinate of each surface element can be reconstructed by appropriately weighting the amplitude signal for each color corresponding to the maximum amplitude. This technique can also be applied when a static pattern is provided where the color of at least part of the pattern is time-varying.
[0134] To decrease the amount of data to be processed, the color resolution of the image can be chosen to be less than the spatial resolution. The color information is then provided by data interpolation. Thus, in an embodiment of the invention, the amplitude of the light signal of each of a plurality of sensor elements is determined for each color for selected full-color focal plane positions, and the amplitude of the light signal of each of a plurality of the sensor elements is determined for a color for each position of the focal plane. Then, the color of the surface to be scanned can be obtained by interpolating the color information from full color focal plane positions. In this way, for example, the amplitude is recorded for all colors in a range of N focus positions; while a color is selected to determine the range in all focus positions. N is a number that could be, for example, 3, 5 or 10. This results in a color resolution that is less than the resolution of the surface topology. This technique can also be applied when a static pattern is provided where the color of at least part of the pattern is variable over time.
[0135] Another embodiment of the invention does not register
Petition 870170040087, of 6/12/2017, p. 73/128
48/83 full color information and employs only two light sources with different colors. An example of this is a dental scanner that uses red and blue light to distinguish hard tissue (tooth) from soft tissue (gum).
Impression Scanning [0136] One embodiment of the invention is adapted to impression scanning, such as scanning dental impressions and / or ear canal impressions. Small Cavity Scanner [0137] The specific applications of the scanner according to the invention relate to the scanning of cavities, in particular body cavities. Scanning in cavities may be related to scanning objects in the cavity, such as scanning teeth in the mouth. However, scanning the ear, for example, is related to scanning the internal surface of the cavity itself. In general, scanning a cavity, especially a small cavity, requires some type of probe for the scanner. Thus, in an embodiment of the invention, the probe light emission point and the reflected light accumulation point are located in a probe, said probe being adapted to enter a cavity, such as a body cavity.
[0138] In another embodiment of the invention, the probe is adapted to scan at least a part of the surface of a cavity, such as an ear canal. The ability to scan at least part of the outside of the ear and / or the ear canal and make a virtual or real model of the ear is essential for the design of modern custom hearing aids (eg shell or mold). Currently, ear scanning is performed in a
Petition 870170040087, of 6/12/2017, p. 74/128
49/83 two-step process where a silicone impression of the ear is made first and the impression is later scanned using an external scanner in a second step.
[0139] Thus, an embodiment of the invention comprises a compartment that accommodates the camera, means for generating the pattern, means for varying the focus and means for processing data, and at least one probe accommodating a first optical system, preferably a substantially elongated probe.
[0140] Preferably, the point of emission of the probe light and the point of accumulation of light returned from the scanned object are located in said probe. The optical system in the probe is to transmit the probe light from the compartment towards the object and also to transmit and / or transform the image returned from the object back to the compartment where the camera is located. In this way, the optical system in the probe can comprise a lens system. In one embodiment of the invention, the probe may comprise at least one optical fiber and / or a bundle of fibers to transmit / carry / guide the probe light and / or the light returned from the object's surface. In this case, the optical fiber (s) can (s) act as an optical relay system that only carries light (that is, probe light and returned light) within the probe. In one embodiment of the invention, the probe is endoscopic. The probe can be rigid or flexible. The use of optical fiber (s) in the probe can, for example, provide a flexible probe with a small diameter.
[0141] In one embodiment of the invention, light is transmitted to the object and transformed into an image using only the optical system in the probe, the first optical system. However, in a further embodiment of the invention, the
Petition 870170040087, of 6/12/2017, p. 75/128
The 50/83 compartment may further comprise a second optical system.
[0142] In a further embodiment of the invention, the probe is detachable from the compartment. Preferably, a first probe light emission point and a first returned light accumulation point are located on the probe, and a second probe light emission point and a second returned light accumulation point are located in the compartment. This may require optical systems in both the housing and the probe. In this way, a scan can be obtained with the probe attached to the compartment. However, a scan can also be obtained with the probe decoupled from the compartment, that is, the compartment can be a stand-alone scanner in itself. For example, the probe can be adapted to be inserted and scan the interior of a cavity, whereas the compartment can be adapted to scan external surfaces. The probe coupling may include mechanical and / or electrical transfer between the housing and the probe. For example, the probe coupling can provide an electrical signal to the control electronics in the compartment that signals the current configuration of the device.
[0143] In one embodiment of the invention, the probe light is directed towards the object in a direction substantially parallel to the optical axis and / or the longitudinal axis of the probe. In another embodiment, the probe comprises a rear reflective element, such as a mirror, to direct the probe light in a direction other than the optical axis, preferably in a direction perpendicular to the optical axis. Applying a stand-alone scanner compartment with the probe to the above example
Petition 870170040087, of 6/12/2017, p. 76/128
51/83 uncoupled, the probe light can exit the compartment in a direction parallel to the optical axis of the optical system in the compartment (ie, the second optical system), while, with the probe attached, the probe light can be directed in a direction other than the optical axis of the probe's optical system (ie the first optical system). In this way, the probe is better adapted to scan a cavity.
[0144] In some embodiments of this invention, the residual heat generated in the scanner is used to heat the probe so that less or no condensation occurs on the probe when the probe is inside the body cavity, for example, in the mouth. Residual heat can, for example, be generated by the processing electronics, the light source and / or the mechanism that moves the focusing element.
[0145] In some embodiments of this invention, the scanner provides feedback to the user when recording subsequent scans for a larger model of the 3D surface fails. For example, the scanner may flash the light source.
[0146] In addition, the probe may comprise means for rotating / rotating the reflector element, preferably around an axis substantially parallel to the optical axis and / or the longitudinal axis of the probe. In this way, the probe can be adapted to provide a 360 ° scan around
optical axis and / or longitudinal axis the probe, in preferably, without rotation probe and / or scanner.[0147] In another implementation of invention, an plurality in different probes matches to
compartment. In this way, different probes adapted to different environments, surfaces, cavities etc. can be
Petition 870170040087, of 6/12/2017, p. 77/128
52/83 attached to the compartment to take into account different scanning situations. A specific example of this is when the scanner comprises a first probe adapted to scan the inside of a human ear and a second probe adapted to scan the outside of said human ear. Instead of a second probe, it can be the compartment itself, that is, with the uncoupled probe, which is adapted to scan the outside of said human ear, that is, the compartment can be adapted to perform a 3D surface scan. In other words: the compartment with the connected probe can be adapted to scan the inside of a human ear and the compartment with the uncoupled probe can be adapted to scan the outside of said human ear. Preferably, means for merging and / or combining 3D data for the inner and outer part of the ear are provided, thus providing a complete 3D model of a human ear.
[0148] For the portable embodiments of this invention, a pistol-like shape is ergonomic because the device rests comfortably within the operator's hand, with most of the mass resting on the hand and / or wrist. In this format, it is advantageous to be able to orient the aforementioned rear reflector in various positions. For example, it may be possible to rotate a probe with the rear reflector element, with or without the decoupling step of the main body of the scanning device. Detachable probes can also be autoclavable, which is definitely an advantage for scanners applied to humans, for example, as medical devices. For embodiments of this invention that perform an element of
Petition 870170040087, of 6/12/2017, p. 78/128
53/83 focus physically in motion by means of an engine, it is advantageous to place this engine in a pistol-shaped handle.
Use of Motion, Gravity and Magnetic Sensors [0149] The portable embodiments of the invention preferably include motion sensors, such as accelerometers and / or gyroscopes. Preferably, these motion sensors are small like microelectromechanical systems (MEMS) motion sensors. Preferably, motion sensors should measure all movement in 3D, that is, both translations and rotations for the three main coordinate axes. The benefits are:
A) Motion sensors can detect vibrations and / or agitation. The affected scans can be discarded or corrected by using image stabilization techniques.
B) Motion sensors can assist with linking and / or recording partial scans among themselves. This advantage is relevant when the field of view of the scanner is smaller than the object to be scanned. In this situation, the scanner is applied to small regions of the object (one at a time), which are then combined to obtain the complete scan. In the ideal case, motion sensors can provide the necessary transformation of relative rigid movement between local coordinates of partial scans, because they measure the relative position of the scanning device in each partial scan. Motion sensors with limited accuracy can also provide a first guess for a link / registration based on partial scanning software based, for example, on the Iterative Closest Point algorithm class, resulting in
Petition 870170040087, of 6/12/2017, p. 79/128
54/83 reduced computing time.
C) Motion sensors can be used (also) as a remote control for the software that comes with the invention. Such software, for example, can be used to view the acquired scan. With the scanning device now acting as a remote control, the user can, for example, rotate and / or pan the view (moving the remote control in the same way that the object on the computer screen must “move”). Especially in clinical applications, dual use of the portable scanner is preferable for hygiene considerations, as the operator avoids contamination from alternative manually operated input devices (touch screen, mouse, keyboard, etc.).
[0150] Even though it is very inaccurate to detect the translation movement, a 3-axis accelerometer can provide the direction of gravity in relation to the scanning device. Also, a magnetometer can provide directional information in relation to the scanning device, in this case, from the earth's magnetic field. Therefore, these devices can assist with the connection / registration and act as a remote control element.
[0151] The present invention relates to different aspects, including the scanning device described above and below, and corresponding methods, devices, uses and / or product means, each providing one or more of the benefits and advantages described in connection with the first aspect mentioned, and each having one or more embodiments corresponding to the embodiments described in connection with the first aspect mentioned and / or disclosed in the appended claims.
[0152] In particular, a method is disclosed here for
Petition 870170040087, of 6/12/2017, p. 80/128
55/83 obtain and / or measure the 3D geometry of at least a part of the surface of an object, said method comprising the steps of:
- generate a probe light that incorporates a spatial pattern,
- transmitting the probe light to the object along the optical axis of an optical system, thereby illuminating at least part of the object with the said pattern,
- transmit at least part of the light returned from the object to the camera,
- vary the position of the pattern's focus plane on the object while maintaining a fixed spatial relationship of the scanner and the object,
- obtain at least one image from that
matrix in elements sensors, - evaluate a measure correlation in each position in plan in focus enter at any less one pixel image and an occupation in weight in that occupation in weight is determined with
based on the configuration information of the spatial pattern;
- determine, by analyzing the correlation measure, the position (s) in focus of:
- each of a plurality of image pixels on the camera for said series of focus plane positions, or
- each of a plurality of groups of image pixels on the camera for said range of focus planes, and
- turn the data into focus into 3D real-world coordinates.
[0153] A computer program product comprising program code means for causing a data processing system to execute is also disclosed.
Petition 870170040087, of 6/12/2017, p. 81/128
56/83 the method, when said program code means are executed in the data processing system.
[0154] A computer program product is also disclosed, comprising a computer-readable medium storing the program code means.
[0155] Another aspect of the invention relates to a scanner to obtain and / or measure the 3D geometry of at least a part of the surface of an object, said scanner comprising:
- at least one camera that accommodates an array of sensor elements,
- means for generating a probe light,
- means for transmitting probe light to the object, thereby illuminating at least part of the object,
- means for transmitting the light returned from the object to the camera,
- means for varying the position of the focus plane on the object,
- means for obtaining at least one image of said sensor element array,
- means for:
a) determine the focus position (s) of:
- each of a plurality of sensor elements for a series of focus plane positions, or
- each of a plurality of groups of sensor elements for a series of focus plane positions, and
b) transform data in focus into 3D real-world coordinates;
wherein the scanner further comprises counterbalance means for counterbalancing the means for varying the position of the focus plane.
Petition 870170040087, of 6/12/2017, p. 82/128
57/83 [0156] A method for obtaining and / or measuring the 3D geometry of at least a part of the surface of an object is also disclosed, said method comprising the steps of:
- accommodate an array of sensor elements,
- generate a probe light,
- transmit the probe light towards the object, thereby illuminating at least part of the object,
- transmit the light returned from the object to the camera,
- vary the position of the focus plane on the object,
- obtain at least one image from said sensor element matrix,
- determine the focus position (s) of:
- each of a plurality of sensor elements
for series in plan positions in focus, or each one of a plurality in groups of elements sensors for an series of positions in focus plane, and - turning data into focus in coordinates of world
real 3D;
wherein the method further comprises counterbalancing the means to vary the position of the focus plane.
[0157] Another aspect of the invention relates to a portable 3D scanner with a handle at an angle of more than 30 degrees from the main optical axis of the scanner, for use in intraoral or intra-auricular scanning. Brief Description of the Drawings [0158] Figure 1: A schematic presentation of a first exemplary embodiment of the device according to the invention.
[0159] Figure 2: A schematic presentation of a second exemplary embodiment of the device according to the invention (optical correlation).
Petition 870170040087, of 6/12/2017, p. 83/128
58/83 [0160] Figure 3: Schematic presentations of exemplary embodiments of patterns according to the invention.
[0161] Figure 4: A schematic presentation of a first exemplary embodiment of a flat scanning tip with a long scanning length, using a plurality of mirrors (dichroic) and light sources.
[0162] Figure 5: - deleted -] [0163] Figure 6: A schematic presentation of a third exemplary embodiment of a flat scanning tip with a long scanning length, using a curved mirror.
[0164] Figure 7: A schematic presentation of a fourth exemplary embodiment of a flat scanning tip with a long scanning length, using a diffraction grating.
[0165] Figure 8: A schematic presentation of an exemplary embodiment of a mass-balanced focus lens scanner.
[0166] Figure 9: A schematic presentation of an exemplary embodiment of a device for simultaneously scanning the shape and color of a surface.
[0167] Figure 12: A schematic presentation of an exemplary embodiment of a device for scanning at least a part of the outside of the human ear and / or a part of the ear canal of a human ear.
[0168] Figure 13 (a) and (b): Schemes showing how a scanning embodiment can be used to both scan the outer and inner ear, respectively.
[0169] Figure 14: Outline of an implementation of
Petition 870170040087, of 6/12/2017, p. 84/128
59/83 scanner probe used to scan a narrow body cavity, such as a human ear.
[0170] Figure 15: Examples of mirror configurations to be used with a scanner probe.
[0171] Figure 16: A schematic representation of the reference signal values / weight values per pixel for a checkerboard pattern in an idealized optical system.
[0172] Figure 17: Illustration of the process of generating a fused reference signal, visualized as images.
[0173] Figure 18: Upper part: exemplary image with a projected pattern showing a human tooth. Bottom: The correlation measure for the series of focus lens positions in the group of pixels framed at the top of the figure.
[0174] Figure 19: Example of a fused correlation measurement image of an intraoral scene.
[0175] Figure 20: Example of a portable intraoral scanner with a pistol-like grip and a removable tip.
[0176] It will be understood that the ray traces and lenses illustrated in the figures are for illustrative purposes only and represent optical paths generally in the systems discussed. The traces of rays and lens shapes are not to be understood as limiting the scope of the invention in any sense including the magnitude, direction or focus of light rays or beams passing through various optical components, without prejudice to any variations in number, direction , shape, position or size thereof, except as expressly indicated in the detailed description below of the exemplary embodiments illustrated in the drawings.
Petition 870170040087, of 6/12/2017, p. 85/128
60/83
Detailed Description of the Drawings [0177] A functional portable 3D surface scanner should preferably have the following properties:
1) telecentricity in the space of the object to be scanned,
2) possibility of changing the focal plane while maintaining telecentricity and expansion,
3) simple focusing scheme that involves tuning optical components only on the device cable and not on the probe tip, and
4) an overall size consistent with a portable scanning device.
[0178] The scanner embodiment illustrated in Figure 1 is a portable scanner with all the components inside the compartment (head) 100. The scanner head comprises a tip that can be inserted into a cavity, a 110 light source, optical 120 to collect the light from the light source, standard generation medium 130, a beam splitter 140, an image sensor and electronic components 180, a lens system that transmits and transforms the light between the pattern, the object to be scanned and the image sensor (camera) 180. The light from the light source 110 moves back and forth through the optical system 150. During this passage, the optical system transforms the pattern 130 onto the object to be scanned 200 and still image the object to be scanned on the image sensor 181. The lens system includes a focusing element 151 that can be adjusted to shift the image's focal plane of the pattern in the probe object 200. One way to incorporate the focus element is to physically move a single lens element to
Petition 870170040087, of 6/12/2017, p. 86/128
61/83 back and forth along the optical axis. The device may include polarizing optics 160. The device may include folding optics 170 which directs light out of the device in a direction other than the optical axis of the lens system, for example, in a direction perpendicular to the optical axis of the lens system. lens. As a whole, the optical system provides an image formation of the pattern on the object to be scanned and the object to be scanned for the camera. One application of the device could be to determine the 3D structure of the teeth in the oral cavity. Another application could be to determine the 3D shape of the ear canal and the outside of the ear.
[0179] The optical axis in Figure 1 is the axis defined by a straight line through the light source 110, optical 120 and the lenses in the optical system 150. This also corresponds to the longitudinal axis of the scanner illustrated in Figure 1. The optical path it is the light path from light source 110 to object 220 and back to camera 180. The optical path can change direction, for example, through beam splitter 140 and folding optics 170.
[0180] The focus element is adjusted in such a way that the image of the pattern on the scanned object is shifted along the optical axis, preferably in equal steps from one end of the scanning region to the other. When the pattern is periodically varied over time to a fixed focus position, then the regions in focus on the object will exhibit a pattern that varies spatially. Out-of-focus regions will exhibit less or no contrast in light variation. The 3D surface structure of the probed object is determined by finding the plane corresponding to an extreme in the correlation measure for each sensor in the
Petition 870170040087, of 6/12/2017, p. 87/128
62/83 camera sensors or each group of sensors in the camera sensor array when recording the correlation measurement for a range of different focus positions 300. Preferably, it would be possible to move the focus position in equal steps from one end of the scanning region to the other.
Pattern Generation [0181] An embodiment of the pattern generation means is shown in Figure 3a: A transparent wheel with an opaque mask 133 in the form of spokes pointing radially from the center of the wheel. In this embodiment, the pattern is varied in time by rotating the wheel with a motor 131 connected to the wheel with, for example, a drive axle 132. The position of the pattern in time can be recorded during rotation. This can be achieved, for example, by using a position encoder on the edge of the pattern 134 or by obtaining the position of the shaft directly from the motor 131.
[0182] Figure 3b illustrates another embodiment of the pattern generation means: a segmented light source 135, preferably a segmented LED. In this embodiment, the LED surface is transformed into an image on the object under investigation. The individual LED segments 136 are switched on and off to provide a known variable time pattern in the object. The control electronics 137 of the time-varying pattern are connected to the segmented light source via electrical wires 138. The pattern is thus integrated into the light source and a separate light source is not required.
[0183] Figure 3c illustrates a static pattern, as applied in a spatial correlation embodiment of this invention. The checkerboard pattern shown is preferred because calculations for this regular pattern are easier.
Petition 870170040087, of 6/12/2017, p. 88/128
63/83
Time Correlation [0184] Figure 1 is also an exemplary illustration of the time correlation, in which an image of the pattern in and / or the object is formed on the camera. Each individual light sensing element in the camera sees a variation in the signal level corresponding to the variation in the lighting pattern on the object. The variation is periodic in the illustrative illustration. The light variation for each individual light sensing element will have a constant phase shift in relation to the position of the pattern.
[0185] The correlation measure can be obtained by recording n images on the camera during at least one period of oscillation, n is an integer greater than one. Registering the pattern position for each individual image combined with the phase shift values for each detection element and the recorded images allows an efficient extraction of the correlation measure on each individual detection element in the camera using the following formula, n
i = 1 [0186] Here Aj is the estimated correlation measure of detection element j, Ii r j, ... In r j are the n recorded signals of detection element j, fi r j, ... f n , j are the n reference signal values obtained from knowing the pattern configuration for each image recording, f has two indices i, j. The variation of f with the first index is derived from knowing the pattern position during each image recording. The variation of f with the second index is derived from the knowledge of the pattern geometry that can be determined before the 3D scan.
Petition 870170040087, of 6/12/2017, p. 89/128
64/83 [0187] The focus position corresponding to the pattern in focus on the object for a single sensor in the camera will be given by one end in the recorded correlation measure of that sensor when the focus position is varied over a range of values, preferably , in equal steps from one end of the scanning region to the other.
Spatial Correlation [0188] In an example of the spatial correlation scheme, an image of the object with a projected guadricated pattern is registered with the high resolution allowed by the image sensor. The spatial correlation grid must then analyze groups of pixels in the recorded image and extract the correlation measure in the pattern. An extreme in the obtained correlation measures indicates the position in focus. To simplify, you can use a framed pattern with a period corresponding to n = N x N pixels in the sensor and then analyze the correlation measure within a period of the pattern (in general, the pattern does not have to be N x N) . In the best case, it will be possible to align the pattern so that the edges of the chessboard coincide with the pixel edges, but the scanning principle is not based on this. Figure 16 shows this for the case n = 4 x 4 = 16. For a sensor with W x H = 1024 x 512 pixels, this would correspond to obtaining 256 x 128 points of measurement of correlation of an image. The extraction of the correlation measure Aj within an N x N group of pixels with label j is given by n
i = l in which fj =. .fn, j) is the reference signal vector obtained from the knowledge of the pattern configuration, and Ij =
Petition 870170040087, of 6/12/2017, p. 90/128
65/83 (li, j,... In, j) is an input signal vector.
[0189] To suppress any part of CC in the light, we prefer for everyone as:
n ° = ςλ i = l [0190] For the situation represented in Figure 16, for example, fi, j = -1 for the pixels corresponding to the dark parts of the pattern, and fi, j = +1 otherwise. If the edge of the pattern is not aligned with the edges of the pixels, or if the optical system is not perfect (and, therefore, in all practical applications), then fi, j would assume values between -1 and +1 for some i. A detailed description of how to determine the reference function is provided later.
Optical Correlation [0191] An example of the optical correlation shown in Figure 2. In this embodiment, an image is formed on camera 180 which is an overlay of pattern 130 with the probed object 200. In this embodiment, the pattern is transmissive in nature where light it is transmitted through the pattern and the image of the pattern is projected on the object and vice versa. In particular, this involves the retransmission of light through the pattern in the opposite direction. An image of the pattern on the camera is then formed with the help of a beam splitter 140. The result of this arrangement is an image to be formed on the camera which is an overlay of the pattern itself and the object to be scanned. A different way of expressing this is that the image on the camera is substantially a multiplication of an image of the pattern projected onto the object with the pattern itself.
[0192] The variation is periodic in the illustration
Petition 870170040087, of 6/12/2017, p. 91/128
Exemplary 66/83. The correlation measure between the variation of light in the object and the pattern for a given focusing distance can be obtained by time by integrating the camera signal during a large number of oscillation periods so that the exact synchronization of the pattern oscillation time and camera integration time is not important. The focus position corresponding to the pattern in focus on the object for a single sensor in the camera will be given by the maximum signal value recorded from that sensor when the focus position is varied over a range of values, preferably in equal steps from one scan region to the other.
Finding the Predetermined Reference Function [0193] Next, the process for calculating the reference signal f is described for a spatial correlation embodiment of this invention, and illustrated in a stylized manner in Figure 17.
[0194] The process begins by recording a series of images of the checkerboard pattern as projected, for example, on a flat surface, preferably oriented orthogonally to the optical axis of the scanner. The images are taken at different positions of the focusing element, effectively covering the entire path of said focusing element. Preferably, images are taken at equidistant locations.
[0195] Since the focus plane is generally not a geometric plane, different regions of the plane surface will be focused on different images. Examples of three of these images are shown in Figures 17a-17c, where 1700 is a region in focus. Note that in this stylized figure, the transitions between regions in and out of focus,
Petition 870170040087, of 6/12/2017, p. 92/128
67/83 respectively, are exaggerated in order to demonstrate the principle more clearly. In addition, in general, there will be many more images than just the three used in this simple example.
[0196] The regions in focus within an image are found as those of maximum intensity variance (indicating maximum contrast) across that series of images. The region to calculate the variance does not have to be the same as the size of the pixel group used in the spatial correlation, but it must be large enough to contain both the dark and light regions of the pattern and must be the same for all images in the series.
[0197] Finally, a “fused image (Figure 17d) is generated by combining all the regions in focus of the series (17a-17c). Note that in real applications, the merged image will generally not be a perfect black and white chessboard, but will include intermediate gray values caused by an imperfect optical system and a chessboard that is not perfectly aligned with the sensors of the camera. An example of part of an actual rendered image is shown in Figure 17e.
[0198] The pixel intensities within this image can be interpreted as an “image in weight with the same dimensions as the original image of the pattern. In other words, the pixel values can be interpreted as the reference signal and the reference vector / weight value set fj = (fi, j, ... fn, j) for the n pixels in the pixel group with index j can be found from the pixel values.
[0199] For convenience in implementing calculations, especially when performed on an FPGA, the image
Petition 870170040087, of 6/12/2017, p. 93/128
Fused 68/83 can be subdivided into pixel groups. The CC portion of the signal can then be removed by subtracting the average intensity within the group from each pixel intensity value. In addition, it can then be normalized by dividing by the standard deviation within the group. The weight values thus processed are an alternative description of the reference signal.
[0200] Due to the periodic nature of the “fused image and, therefore, the“ weighted image, the latter can be compressed efficiently, thus minimizing the memory requirements on the electronic components that can implement the algorithm described here. For example, the PNG algorithm can be used for compression.
The Correlation Image [0201] A “correlation image is generated based on the“ fused image and the set of images recorded with the camera during a scan. For spatial correlation based on an N x N grid pattern, remember that the correlation measure within the group is λ. _ yNxNf. j.
~ 2-n = l Jí.j 1 i, j>
where fj =. .fn, j) are fused image values and Ij = (li, j,.. In, j) are values of an image recorded on the camera. The pixel groupings used in any CC removal and possibly the normalization that produced the merged image are the same as in the calculation above. For each image recorded by the scanner when scanning the focus element, there will therefore be an array of values (H / N) x (W / N) of A. This matrix can be viewed as an image.
[0202] Figure 18 (upper section) shows an example of a correlation measure image, here of part of a human tooth and its border. A 6x6 pixel pixel group is marked
Petition 870170040087, of 6/12/2017, p. 94/128
69/83 by a 1801. square. For this example of group of pixels, the series of correlation measures A over all images within a scan of the focusing element is shown in the graph in the lower section of Figure 18 (cross hairs). The x-axis on the graph is the position of the focusing element, while the y-axis shows the magnitude of A. Running a simple Gaussian filter on the raw series results in a smoothed series (solid line). In the figure, the focus element is in the position that provides the ideal focus for the example pixel group. This fact is subjectively visible in the image, but also determined quantitatively as the maximum of the series of A. The vertical line 1802 in the lower section of Figure 18 indicates the location of the global end and, therefore, the position in focus. Note that, in this example, the location of the maximums in the smoothed and gross series, respectively, is visually indistinguishable. In principle, however, it is possible and also advantageous to find the maximum location from the smoothed series, since it can be between two lens positions and thus provide greater precision.
[0203] The matrix of values of A can be calculated for each image recorded in a scan of the focus element. By combining the global extremes (in all images) of A in all groups of pixels in the same way that the merged image was combined, a pseudo-image of dimension (H / N) x (W / N) can be obtained. This is called a “fused correlation image. An example of a fused correlation image of some teeth and gums is shown in Figure 19. As can be seen, it is useful for visualization purposes.
Increased Field of View
Petition 870170040087, of 6/12/2017, p. 95/128
70/83 [0204] For the scanning head to be inserted in a small space, such as a patient's oral cavity, the tip of the scanning head must be small enough. At the same time, the light coming out of the scanning head must leave the scanning head in a direction other than the optical axis, for example, in a direction perpendicular to the optical axis. In one embodiment of the invention, a 45 degree mirror is used as a folding optic 170 directing the light path towards the object. In this embodiment, the height of the scanning tip must be at least as large as the scanning length.
[0205] Another embodiment of the invention is shown in Figure 4. This embodiment of the invention allows a scanning tip with a lower height (indicated by b in the figure) than the scanning length (designated by a in the figure). Light from two sources 110 and 111 that emits light of different colors / wavelengths is fused together using a suitably coated plate (for example, a dichroic filter) 112 that transmits light from 110 and reflects light from 111. At the scanning tip, a properly coated plate (for example, a dichroic filter) 171 reflects light from one source to the object and transmits the light from the other source to a mirror at the end of the scanning tip 172. During scanning, the focus position is moved in such a way that the 110 light is used to project an image of the pattern to a lower position 171 while 111 is turned off. The 3D surface of the object in the lower region 171 is recorded. Then 110 is turned off and 111 is turned on and the focus position is moved in such a way that the 111 light is used to project a
Petition 870170040087, of 6/12/2017, p. 96/128
71/83 image of the pattern to a lower position 172. The 3D surface of the object in the lower region 172 is recorded. The region covered with 110 and 111 light, respectively, may partially overlap.
[0206] Another embodiment of the invention that allows a scanning tip with a lower height (designated by b in the figure) than the scanning length (designated by a in the figure) is shown in Figure 6. In this embodiment, the folding optics 170 comprises a curved fold mirror 173 that can be supplemented with one or two lens elements 175 and 176 with surfaces that can be aspherical to provide corrected optical image formation.
[0207] Another embodiment of the invention that allows a scanning tip with a smaller height (designated by b in the figure) than the scanning length (designated by a in the figure) is shown in Figure 7. In this embodiment, the folding optics 170 comprise a grating 177 that provides anamorphic magnification, so that the image of the pattern on the object to be scanned is stretched. The grating can be blazed. The light source 110 needs to be monochromatic or semi-monochromatic in this embodiment.
High Spatial Contrast of the Projected Pattern on Difficult Objects
[0208] High spatial contrast of the in-focus pattern image on the object is required to obtain a high-signal correlation measure from the camera images. This, in turn, is necessary to obtain a good estimate of the focus position corresponding to an extremum of the correlation measure. This necessary condition for successful scanning is easily met on objects
with a diffuse surface and negligible light penetration. For some objects, however, it is difficult to achieve high spatial contrast or, more generally, high variation.
[0209] One difficult type of object, for example, is an object that exhibits multiple scattering with a light diffusion length that is long compared to the smallest characteristic dimension of the spatial pattern imaged onto the object. A human tooth is an example of such an object. The human ear and the ear canal are other examples. Improved spatial contrast on these objects can be achieved by preferential imaging of the specular surface reflection of the object onto the camera. One embodiment of the invention applies the polarization arrangement shown in Figure 1. In this embodiment, beam splitter 140 is a polarizing beam splitter that transmits one and reflects the other of two orthogonal polarization states, for example, the S and P polarization states. The light transmitted through the lens system 150 is therefore of one specific polarization state. Before exiting the device, the polarization state is changed with a retardation plate 160. A preferred type of retardation plate is a quarter-wave plate. A linearly polarized light wave is transformed into a circularly polarized light wave when passing a quarter-wave plate whose fast axis is oriented at 45 degrees to the linear polarization direction. A specular reflection from the object has the property of inverting the helicity of a circularly polarized light wave. After the specularly reflected light passes the quarter-wave plate again, its polarization state becomes
orthogonal to the state incident on the object. For example, an S polarization state propagating downstream towards the object will be returned as a P polarization state. This implies that the specularly reflected light wave will be directed to the image sensor 181 by beam splitter 140. The light that enters the object and is returned by one or more scattering events becomes depolarized, and half of this light will be directed to the image sensor 181 by beam splitter 140.
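The round-trip effect of the quarter-wave plate can be checked with Jones calculus; this worked step is an addition for clarity, not part of the patent text, and uses one common sign convention in which the ideal specular retro-reflection acts trivially in the chosen transverse basis:

$$
Q=\frac{1}{\sqrt{2}}\begin{pmatrix}1&-i\\-i&1\end{pmatrix},\qquad
Q^{2}=\begin{pmatrix}0&-i\\-i&0\end{pmatrix},
$$

so the double pass through the quarter-wave plate (fast axis at 45 degrees) is equivalent to a half-wave plate at 45 degrees: up to a global phase, $Q^{2}\binom{1}{0}\propto\binom{0}{1}$, that is, an incident S state returns as a P state, orthogonal to the incident one.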
[0210] Another type of difficult object is an object with a shiny or metallic-looking surface. This is particularly true for a polished object or an object with a very smooth surface. A piece of jewelry is an example of such an object. Even very smooth and shiny objects, however, exhibit some amount of diffuse reflection. Improved spatial contrast on these objects can be achieved by preferential imaging of the diffuse surface reflection of the object onto the camera. In this embodiment, beam splitter 140 is a polarizing beam splitter that transmits one and reflects the other of two orthogonal polarization states, for example, the S and P polarization states. The light transmitted through the lens system 150 is thus of one specific polarization state. A diffuse reflection from the object has the property of losing its polarization. This implies that half of the diffusely reflected light wave will be directed to the image sensor 181 by beam splitter 140. Light that is specularly reflected from the object preserves its polarization state, so none of it will be directed to the image sensor 181 by beam splitter 140.
Reduction of Shake Caused by the Focusing Element
[0211] During scanning, the focus position is changed over a range of values, preferably provided by a focusing element 151 in the optical system 150. Figure 8 illustrates an example of how to reduce the shake caused by the oscillating focusing element. The focusing element is a lens element 152 that is mounted on a translation stage 153 and is moved back and forth along the optical axis of said optical system by a mechanical mechanism 154 that includes a motor 155. During scanning, the center of mass of the handheld device is shifted due to the physical movement of the lens element and its support. This results in an undesirable shaking of the handheld device while scanning. The situation is aggravated if the scan is fast, for example, with a scan time of less than one second. In one implementation of the invention, the displacement of the center of mass is eliminated by moving a counterweight 156 in the direction opposite to the lens element, such that the center of mass of the handheld device remains fixed. In the preferred implementation, the focusing lens and the counterweight are mechanically linked, and their opposite movements are driven by the same motor.
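As a one-line check of this balance condition (an illustrative addition; the symbols are not used in the patent): if the lens assembly of mass $m_l$ is displaced by $\Delta x_l$ along the optical axis, the center of mass stays fixed exactly when the counterweight of mass $m_c$ satisfies

$$m_l\,\Delta x_l + m_c\,\Delta x_c = 0 \quad\Longrightarrow\quad \Delta x_c = -\frac{m_l}{m_c}\,\Delta x_l,$$

so equal masses require equal and opposite travel, while a heavier counterweight can use a proportionally shorter stroke.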
Color Measurement
[0212] An embodiment of a 3D color scanner is shown in Figure 9. Three light sources 110, 111 and 113 emit red, green and blue light. The light sources can be LEDs or lasers. The light is merged so that the beams overlap, or essentially overlap. This can be achieved by using two suitably coated plates 112 and 114. Plate 112 transmits the light from 110 and reflects the light from 111. Plate 114 transmits the light from 110 and 111 and
reflects the light from 113. Color measurement is performed as follows: for a given focus position, the amplitude of the time-varying pattern projected onto the probed object is determined for each sensor element in sensor 181 by one of the methods described above, for each of the light sources individually. In the preferred embodiment, only one light source is turned on at a time, and the light sources are turned on in turn. In this embodiment, the optical system 150 can be achromatic. After the amplitude has been determined for each light source, the focus position is moved to the next position and the process is repeated. The color, expressed for example as an RGB color coordinate, of each surface element can be reconstructed by appropriately weighting the amplitude signal of each color corresponding to the maximum amplitude.
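The exact weighting is left open above ("appropriately weighting"); the following sketch shows one plausible reading for a single sensor element, and is an illustrative assumption rather than the patent's prescribed formula:

```python
import numpy as np

def rgb_of_surface_element(amp_r, amp_g, amp_b):
    """amp_r/g/b: 1-D amplitude series of one sensor element over all focus
    positions, one per light source. The weighting is taken at the focus
    position of maximum total amplitude, i.e. where the surface is in focus."""
    total = np.asarray(amp_r) + np.asarray(amp_g) + np.asarray(amp_b)
    k = int(np.argmax(total))                    # in-focus position index
    rgb = np.array([amp_r[k], amp_g[k], amp_b[k]], dtype=float)
    return rgb / (rgb.sum() + 1e-12)             # normalized RGB coordinate
```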
[0213] A specific embodiment of the invention records the amplitude for all colors only at an interval of P focus positions, while a single color is selected for amplitude determination at every focus position. P is a number that can be, for example, 3, 5 or 10. This results in a color resolution that is lower than the resolution of the surface topology. The color of each surface element of the scanned object is then determined by interpolation between the focus positions where full color information was obtained. This is analogous to the Bayer color scheme used in many digital color cameras, in which the color resolution is also lower than the spatial resolution and the color information has to be interpolated.
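A hypothetical sketch of this reduced-color-resolution scheme, assuming linear interpolation between the sparse full-color samples (the names and the choice of linear interpolation are assumptions):

```python
import numpy as np

def interpolated_color(in_focus_pos, color_positions, color_samples):
    """color_positions: sorted focus positions (every P-th) where full color
    was recorded; color_samples: matching (len(color_positions), 3) RGB
    values. Returns the RGB value linearly interpolated at in_focus_pos."""
    color_samples = np.asarray(color_samples, float)
    return np.array([
        np.interp(in_focus_pos, color_positions, color_samples[:, c])
        for c in range(3)
    ])
```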
[0214] A simpler embodiment of the 3D color scanner does not record full color information and employs
only two light sources with different colors. An example of this is a dental scanner that uses red and blue light to distinguish hard tissue (tooth) from soft tissue (gum).
Ear Scanner Embodiment
[0215] Figures 12 to 15 schematically illustrate an embodiment of a scanner based on time-varying structured light illumination for direct scanning of human ears, scanning both the outside and the inside of a human ear by using a common external scanner part and a detachable probe. This embodiment is advantageous in that it allows a non-invasive scan using a probe designed to be inserted into small cavities, such as a human ear. This is achieved, in part, by placing the bulky and essential parts of the scanner, such as the scanner camera, the light source, the electronic components and the focusing optics, outside the narrowly confined part of the ear canal.
[0216] The ability to scan the outside and inside of human ears and make a virtual or real model of the ear is essential in the design of a custom-fitted hearing aid (for example, a shell or ear mold). Currently, ear scanning is performed in a two-step process in which a silicone impression of the ear is taken first and the impression is later scanned using an external scanner in a second step. The impression process has several disadvantages, briefly described below. A major drawback is that impressions taken even by qualified clinical professionals are frequently of poor quality, owing to the preparation and techniques required. Inaccuracies may arise
because the impression material is known to expand during hardening, and deformation and fracture of the impression are often created when the impression is removed from the ear. Another disadvantage relates to the health risks involved in taking an impression, owing to irritation and allergic responses, damage to the tympanic membrane and infections. Finally, the impression process is an uncomfortable experience for many patients, especially for young children, who often require impressions at regular intervals (for example, every four months) to accommodate the changing dimensions of the ear canal. In summary, these disadvantages can be overcome if it is possible to scan the outer and inner ear in a non-invasive manner and obtain a registration between the inner and outer ear surfaces.
[0217] The following is not restricted to scanning the ear, but can be used to scan any small body cavity. Figure 12 is a schematic of an embodiment of such a scanner. The scanner consists of two main parts - an external scanner part 1001 and a scanner probe 1002. The external scanner part can be used without the probe to obtain a larger field of view, necessary, for example, to scan the external part of the ear 1102, or the first part of the ear canal up to the first bend. The large field of view of the external scanner part is important for obtaining a good registration between individual subscans and a high overall accuracy. When a scanner probe 1202 is attached to the external scanner part 1201, the combined scanner allows the scanning of small, curved cavity surfaces, such as the
inside of an ear 1203. In this way, and using the same system, the external scanner part and the combined probe are able to scan larger external areas together with smaller internal areas. In Figure 12, the external scanner embodiment 1001 consists of a diverging light source 1003 (laser, LED, tungsten or other type) that is collimated using collimation optics 1004. The collimated light is used to illuminate a transparent object 1005 (for example, glass) carrying an opaque pattern, for example, fringes. The pattern is subsequently imaged onto the object to be scanned using a suitable optical system. The pattern is observed using a similar optical system and a camera 1006, where the camera is positioned outside the cavity. The 3D information is obtained from the 2D images by observing the oscillation of light created by the movement of the pattern across the scanned object, as contained in the individual pixel amplitudes.
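As a rough illustration of this amplitude extraction (an addition for clarity; the data layout and the peak-to-peak definition are assumptions, and the correlation-measure approach described earlier is the more complete treatment):

```python
import numpy as np

def pixel_amplitudes(frames):
    """frames: (num_pattern_positions, H, W) images recorded while the fringe
    pattern moves at one fixed focus plane position. Returns an (H, W) map of
    the peak-to-peak oscillation of each pixel."""
    frames = np.asarray(frames, float)
    return frames.max(axis=0) - frames.min(axis=0)
```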
[0218] To facilitate the movement of the pattern, the fringe pattern 1005 is rotatable in one embodiment. In another embodiment, the fringe pattern is positioned on a translation stage that moves in a plane perpendicular to the optical axis with a certain oscillation frequency. The light to and from the scanned object is directed through a beam splitter arrangement 1007, which consists of a prism cube in one embodiment and of an angled plate or membrane in another embodiment. The beam splitter serves to transmit the source light further down the system, while at the same time guiding the light reflected from the scanned object back to the camera, which is positioned on an axis perpendicular to the common axis of the light source and beam splitter.
[0219] To move the focus plane, the external scanner part includes focusing optics, which in one embodiment consist of a single movable lens 1008. The purpose of the focusing optics is to facilitate the movement of the focus plane of the entire imaging system over the required scanning range along the optical axis. In one embodiment, the focusing optics of the external scanner part 1101 include a lens that can focus the light directly, without using any additional optics, as shown in Figure 13a. In another embodiment, the external scanner part is provided with a wide-angle lens designed with a large field of view, for example, large enough to scan the outside of a human ear 1102.
[0220] The optical part of the scanner probe consists of an endoscopic optical relay system 1009 followed by a probe objective 1010, both of which are small enough to fit into the canal of a human ear. These optical systems can consist of a plurality of optical fibers and lenses, and serve to relay and focus the light from the external scanner part onto the scanned object 1014 (for example, the inner surface of an ear), as well as to collimate and relay the light reflected from the scanned object back to the external scanner part. In one embodiment, the probe objective provides a telecentric projection of the fringe pattern onto the scanned object. Telecentric projection can significantly facilitate the mapping of the acquired 2D images to 3D data. In another embodiment, the chief rays (central ray of each ray bundle) of the probe objective are divergent (non-
telecentric), giving the camera a viewing angle larger than zero, as shown in Figure 13a.
[0221] The position of the focus plane is controlled by the focusing optics 1008 and can be moved over a range large enough to at least cover the scanned surface 1014. A single subscan is achieved by collecting a number of 2D images at different positions of the focus plane and at different positions of the fringe pattern, as previously described. Where the focus plane coincides with the scanned surface at a given pixel position, the fringe pattern is projected onto the surface point in focus and with high contrast, thus giving rise to a large variation, or amplitude, of the pixel value over time. For each pixel, it is thus possible to identify the individual settings of the focusing optics for which that pixel is in focus. Using knowledge of the optical system, it is possible to transform the contrast information versus focus plane position into 3D surface information, on an individual pixel basis.
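A hedged sketch of this per-pixel conversion, assuming amplitudes have been collected over a sweep of focusing-optics settings and that a calibrated mapping from setting to depth is available; the linear calibration stand-in is purely illustrative and not from the patent:

```python
import numpy as np

def surface_depth_map(amplitude_stack, focus_settings, depth_of_setting):
    """amplitude_stack: (num_settings, H, W) per-pixel amplitudes, one slice
    per focusing-optics setting; focus_settings: the matching 1-D settings;
    depth_of_setting: calibrated, vectorized map from setting to depth."""
    best = np.argmax(np.asarray(amplitude_stack), axis=0)  # per-pixel index
    return depth_of_setting(np.asarray(focus_settings)[best])

# Hypothetical calibration stand-in: depth linear in the focus setting.
linear_calibration = lambda s: 0.25 * s + 10.0  # millimetres, made-up values
```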
[0222] In one embodiment, a mirror arrangement 1011, consisting of a single reflecting mirror, a prism, or an arrangement of mirrors, is located after the probe objective 1010. This arrangement serves to reflect the rays in a viewing direction different from that of the probe axis. Different exemplary mirror arrangements are shown in Figures 15a-15d. In a particular embodiment, the angle between the mirror normal and the optical axis is approximately 45 degrees, thus providing a view at 90 degrees relative to the probe axis - an ideal arrangement for looking around corners. A transparent window 1012 is positioned adjacent to the mirror and forms
part of the probe housing/shell, to allow light to pass between the probe and the scanned object while keeping the optics clean of external dirt particles.
[0223] To reduce the probe movement required of a scanner operator, the mirror arrangement can be rotated using a motor 1013. In one embodiment, the mirror arrangement rotates at a constant speed. Through the complete rotation of a single mirror, it is thus possible to scan with a 360-degree coverage around the probe axis without physically moving the probe. In this case, the probe window 1012 needs to surround the probe to allow viewing from all angles. In another embodiment, the mirror rotates with a certain rotational oscillation frequency. In yet another embodiment, the inclination of the mirror arrangement relative to the probe axis varies with a certain oscillation frequency.
[0224] A particular embodiment uses a double mirror instead of a single mirror (Figures 15b and 15d). In a special case, the normals of the two mirrors are at an angle of approximately 90 degrees to each other. The use of a double mirror aids the registration of the individual subscans, since information from two opposite surfaces is obtained at the same time. Another benefit of using a double mirror is that only 180 degrees of mirror rotation is required to scan a full 360 degrees. A scanner solution that employs double mirrors can therefore provide 360-degree coverage in less time than single-mirror configurations.
Pistol Grip
[0225] Figure 20 shows an embodiment of the scanner with a pistol grip 2001. This shape is
particularly ergonomic. The scanner in Figure 20 is designed for intraoral scanning of teeth. The tip 2002 can be removed from the main body of the scanner and can be autoclaved. In addition, the tip can assume two positions relative to the main body of the scanner, namely pointing downwards (as in Figure 20) and pointing upwards. Therefore, scanning the upper and the lower part of a patient's mouth is equally comfortable for the operator. Note that the scanner shown in Figure 20 is an early prototype with several cables attached for testing purposes only.
[0226] Although some embodiments have been described and illustrated in detail, the invention is not restricted to them, but can also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it should be understood that other embodiments can be used and that structural and functional modifications can be made without departing from the scope of the present invention.
[0227] In device claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.
[0228] It should be noted that the term "comprises/comprising", when used in this specification, specifies the presence of the stated features, integers, steps or components, but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
The features of the method described above and below may be implemented in software and carried out on a data processing system or other processing means by the execution of computer-executable instructions. The instructions may be program code means loaded into a memory, such as a RAM, from a storage medium or from another computer over a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software, or in combination with software.
Claims
1. Scanner for obtaining and/or measuring the three-dimensional geometry of at least a part of the surface of an object, wherein said scanner comprises:
- at least one camera (180) accommodating an array of sensor elements,
- means for generating (110, 120, 130) a probe light incorporating a spatial pattern,
- means for transmitting (140, 150, 170) the probe light towards the object, thereby illuminating at least part of the object with said pattern in one or more configurations,
- means for transmitting (140, 150, 170) at least part of the light returned from the object to the camera (180),
- means for varying (151) the position of the focus plane of the pattern on the object, while maintaining a fixed spatial relationship between the scanner and the object,
- means for obtaining at least one image from said array of sensor elements, characterized by the fact that the scanner further comprises:
- means for evaluating a correlation measure at each focus plane position between at least one group of image pixels and a weight function, where the weight function is determined based on information about the configuration of the spatial pattern; and
- data processing means for:
a) determining, by analyzing the correlation measure, the in-focus position of:
each of a plurality of groups of pixels in the image for a series of focus plane positions, and
b) transforming the in-focus data into real-world three-dimensional coordinates.
2. Scanner according to claim 1, characterized by the fact that the means for evaluating a correlation measure is a data processing means.
3. Scanner according to claim 1 or 2, characterized by the fact that the in-focus position for said group of pixels is determined as at least one local extremum position of an optionally smoothed series of correlation measures computed for a plurality of said focus plane positions.
4. Scanner according to claim 3, characterized by the fact that the correlation measure for a focus plane position is calculated as a scalar product, and each scalar product is computed from a signal vector with more than one element representing sensor signals and a weight vector of the same length as said signal vector.
5. Scanner according to any one of claims 1 to 4, characterized by the fact that the pattern is static.
6. Scanner according to any one of claims 1 to 5, characterized by the fact that said pattern has translational and/or rotational periodicity.
7. Scanner according to any one of claims 1 to 6, characterized by the fact that the focus plane of the camera (180) is adapted to be moved synchronously with the focus plane of the pattern.
8. Scanner according to any one of claims 1 to 7, characterized by the fact that the object is an anatomical object, such as an ear canal, or a dental object, such as teeth.
9. Scanner according to any one of claims 1 to 8, characterized by the fact that it further comprises at least one beam splitter (140) located in the optical path, such as a polarizing beam splitter.
10. Scanner according to any one of claims 1 to 9, characterized by the fact that the sensor signal is an integrated intensity of light substantially reflected from the surface of the object.
11. Scanner according to any one of claims 1 to 10, characterized in that the position of the focus plane is periodically varied with a predefined frequency.
12. Scanner according to any one of claims 1 to 11, characterized in that the pattern is a static line pattern or a static checkerboard pattern.
13. Scanner according to any one of claims 1 to 12, characterized by the fact that the array of sensor elements is divided into groups of sensor elements, preferably rectangular groups, such as square groups of sensor elements, preferably of adjacent sensor elements.
14. Scanner according to any one of claims 1 to 13, characterized by the fact that at least one spatial period of the pattern corresponds to one group of sensor elements.
15. Scanner according to any one of claims 1 to 14, characterized by the fact that it further comprises means for polarizing the probe light, such as a polarizing element.
16. Scanner according to any one of claims 1 to 15, characterized by the fact that it further comprises a retardation plate (160) and a linear polarizing element located in the optical path, the retardation plate being, for example, a quarter-wave retardation plate.
17. Scanner according to any one of claims 1 to 16, characterized by the fact that the scanner is adapted to be portable, and wherein the scanner comprises one or more built-in motion sensors that provide data for the combination of at least two partial scans into a three-dimensional model of the surface of an object, the motion sensor data potentially being used as a first guess for a better fit found by software.
18. Scanner according to any one of claims 1 to 17, characterized by the fact that the scanner is adapted to be portable, and wherein the scanner comprises one or more built-in motion sensors that provide data for interacting, via a user interface, with software related to the scanning process.
19. Method for obtaining and/or measuring the three-dimensional geometry of at least a part of the surface of an object, wherein said method comprises the steps of:
- generating a probe light incorporating a spatial pattern,
- transmitting the probe light towards the object along the optical axis of an optical system (150), thereby illuminating at least a part of the object with said pattern,
- transmitting at least part of the light returned from the object to the camera (180),
- varying the position of the focus plane of the pattern on the object, while maintaining a fixed spatial relationship between the scanner and the object,
- obtaining at least one image from said array of sensor elements, characterized by the fact that the method further comprises:
- evaluating a correlation measure at each focus plane position between at least one group of image pixels and a weight function, where the weight function is determined based on information about the configuration of the spatial pattern;
- determining, by analyzing the correlation measure, the in-focus position(s) of:
each of a plurality of groups of pixels of the image on the camera for said series of focus plane positions, and
- transforming the in-focus data into real-world three-dimensional coordinates.
类似技术:
公开号 | 公开日 | 专利标题
US11051002B2|2021-06-29|Focus scanning apparatus
US9939258B2|2018-04-10|Confocal surface topography measurement with fixed focal positions
ES2684135T3|2018-10-01|Cavity scanning with restricted accessibility
JP4822454B2|2011-11-24|Dental optical coherence tomography system
CA2949448A1|2016-01-14|Apparatus for dental confocal imaging
US7965392B2|2011-06-21|Optical coherence tomography device and measuring head
US9638511B2|2017-05-02|Smart phone attachment for 3-D optical coherence tomography imaging
FR2960962A1|2011-12-09|DEVICE FOR THREE DIMENSIONAL AND TEMPORAL MEASUREMENTS BY COLOR OPTICAL FOOTPRINT.
Das et al.2015|A compact structured light based otoscope for three dimensional imaging of the tympanic membrane
US20220086418A1|2022-03-17|Focus scanning apparatus
Abreu de Souza et al.2012|A photogrammetric technique for acquiring accurate head surfaces of newborn infants for optical tomography under clinical conditions
Merman et al.2010|Imaging Acoustic Vibrations Using Spectrally Encoded Interferometry
同族专利:
公开号 | 公开日
ES2607052T3|2017-03-29|
US20180255293A1|2018-09-06|
AU2010262191A1|2011-12-08|
CA2763826A1|2010-12-23|
US10326982B2|2019-06-18|
AU2015205898A1|2015-08-20|
BR112012000189A2|2018-02-06|
US20150054922A1|2015-02-26|
US8878905B2|2014-11-04|
CN104783757A|2015-07-22|
WO2010145669A1|2010-12-23|
US20200169722A1|2020-05-28|
BR112012000189A8|2018-05-29|
CN102802520B|2015-04-01|
EP2442720B1|2016-08-24|
US10349041B2|2019-07-09|
DK2442720T3|2016-12-19|
JP2015083978A|2015-04-30|
JP2012530267A|2012-11-29|
EP2442720A1|2012-04-25|
US20210306617A1|2021-09-30|
US11076146B1|2021-07-27|
US20190124323A1|2019-04-25|
CN104783757B|2018-01-05|
US10097815B2|2018-10-09|
CN102802520A|2012-11-28|
CA2763826C|2020-04-07|
AU2010262191B2|2015-04-23|
US20120092461A1|2012-04-19|
US10349042B1|2019-07-09|
JP5654583B2|2015-01-14|
US20210211638A1|2021-07-08|
US20190200006A1|2019-06-27|
US20190289283A1|2019-09-19|
US11051002B2|2021-06-29|
US10595010B2|2020-03-17|
引用文献:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题

DE2216760C2|1972-04-07|1982-11-11|Hawera Probst Gmbh + Co, 7980 Ravensburg|Rock drill|
US3971065A|1975-03-05|1976-07-20|Eastman Kodak Company|Color imaging array|
US4349880A|1979-03-19|1982-09-14|Rca Corporation|Inspection system for detecting defects in regular patterns|
US4291958A|1980-10-03|1981-09-29|Eastman Kodak Company|Camera with electronic flash and piezoelectric lens motor|
US4575805A|1980-12-24|1986-03-11|Moermann Werner H|Method and apparatus for the fabrication of custom-shaped implants|
US4516231A|1982-08-26|1985-05-07|Rca Corporation|Optical disc system having momentum compensation|
JPS6015834A|1983-07-05|1985-01-26|Mitsubishi Electric Corp|Focus controller|
US4629324A|1983-12-29|1986-12-16|Robotic Vision Systems, Inc.|Arrangement for measuring depth based on lens focusing|
US4640620A|1983-12-29|1987-02-03|Robotic Vision Systems, Inc.|Arrangement for rapid depth measurement using lens focusing|
JPS62100716A|1985-10-29|1987-05-11|Matsushita Electric Ind Co Ltd|Photographing device|
CH672722A5|1986-06-24|1989-12-29|Marco Brandestini|
US4896015A|1988-07-29|1990-01-23|Refractive Laser Research & Development Program, Ltd.|Laser delivery system|
US5372502A|1988-09-02|1994-12-13|Kaltenbach & Voight Gmbh & Co.|Optical probe and method for the three-dimensional surveying of teeth|
US5269325A|1989-05-26|1993-12-14|Biomagnetic Technologies, Inc.|Analysis of biological signals using data from arrays of sensors|
JP2928548B2|1989-08-02|1999-08-03|株式会社日立製作所|Three-dimensional shape detection method and device|
US5181181A|1990-09-27|1993-01-19|Triton Technologies, Inc.|Computer apparatus input device for three-dimensional information|
GB9102903D0|1991-02-12|1991-03-27|Oxford Sensor Tech|An optical sensor|
US5162641A|1991-02-19|1992-11-10|Phoenix Laser Systems, Inc.|System and method for detecting, correcting and measuring depth movement of target tissue in a laser surgical system|
US5131844A|1991-04-08|1992-07-21|Foster-Miller, Inc.|Contact digitizer, particularly for dental applications|
US6485413B1|1991-04-29|2002-11-26|The General Hospital Corporation|Methods and apparatus for forward-directed optical scanning instruments|
US5377011A|1991-09-06|1994-12-27|Koch; Stephen K.|Scanning system for three-dimensional object digitizing|
DE4134117C2|1991-10-15|1996-02-01|Kaltenbach & Voigt|Process for the optical measurement of objects|
FR2699677B1|1992-12-22|1995-03-03|Bertin & Cie|Method and device for determining the color of a transparent, diffusing and absorbing object, such as in particular a tooth.|
JP3321866B2|1992-12-28|2002-09-09|株式会社日立製作所|Surface shape detecting apparatus and method|
US5455899A|1992-12-31|1995-10-03|International Business Machines Corporation|High speed image data processing circuit|
JP3252877B2|1994-02-08|2002-02-04|富士通株式会社|Method and apparatus for checking stored data|
IL111229A|1994-10-10|1998-06-15|Nova Measuring Instr Ltd|Autofocusing microscope|
JP3559593B2|1994-10-26|2004-09-02|オリンパス株式会社|Endoscope device|
US5615003A|1994-11-29|1997-03-25|Hermary; Alexander T.|Electromagnetic profile scanner|
WO1996041304A1|1995-06-07|1996-12-19|The Trustees Of Columbia University In The City Of New York|Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two images due to defocus|
DE19524855A1|1995-07-07|1997-01-09|Siemens Ag|Method and device for computer-aided restoration of teeth|
JPH0942940A|1995-08-03|1997-02-14|Canon Inc|Shape measuring method and device for three-dimensional object|
US5722412A|1996-06-28|1998-03-03|Advanced Technology Laboratories, Inc.|Hand held ultrasonic diagnostic instrument|
DE19640495C2|1996-10-01|1999-12-16|Leica Microsystems|Device for confocal surface measurement|
WO1998045745A1|1997-04-04|1998-10-15|Isis Innovation Limited|Microscopy imaging apparatus and method|
US6259452B1|1997-04-14|2001-07-10|Massachusetts Institute Of Technology|Image drawing system and method with real-time occlusion culling|
IL120892A|1997-05-22|2000-08-31|Cadent Ltd|Method for obtaining a dental occlusion map|
US6148120A|1997-10-30|2000-11-14|Cognex Corporation|Warping of focal images to correct correspondence error|
US6026189A|1997-11-13|2000-02-15|National Research Council Of Canada|Method of recognizing objects within two-dimensional and three-dimensional images|
US6185030B1|1998-03-20|2001-02-06|James W. Overbeck|Wide field of view and high speed scanning microscopy|
JP4136058B2|1998-03-27|2008-08-20|オリンパス株式会社|Endoscope|
IL125659A|1998-08-05|2002-09-12|Cadent Ltd|Method and apparatus for imaging three-dimensional structure|
US7099732B2|1999-03-29|2006-08-29|Genex Technologies, Inc.|Sanitary sleeve or tip for intra-oral three-dimensional camera|
US6227850B1|1999-05-13|2001-05-08|Align Technology, Inc.|Teeth viewing system|
US6648640B2|1999-11-30|2003-11-18|Ora Metrix, Inc.|Interactive orthodontic care system based on intra-oral scanning of teeth|
US7471821B2|2000-04-28|2008-12-30|Orametrix, Inc.|Method and apparatus for registering a known digital object to scanned 3-D model|
US6990228B1|1999-12-17|2006-01-24|Canon Kabushiki Kaisha|Image processing apparatus|
US6476803B1|2000-01-06|2002-11-05|Microsoft Corporation|Object modeling system and process employing noise elimination and robust surface extraction techniques|
US6865289B1|2000-02-07|2005-03-08|Canon Kabushiki Kaisha|Detection and removal of image occlusion errors|
EP1264281A4|2000-02-25|2007-07-11|Univ New York State Res Found|Apparatus and method for volume processing and rendering|
US7027642B2|2000-04-28|2006-04-11|Orametrix, Inc.|Methods for registration of three-dimensional frames to create three-dimensional virtual models of objects|
US6750873B1|2000-06-27|2004-06-15|International Business Machines Corporation|High quality texture reconstruction from multiple scans|
US7625335B2|2000-08-25|2009-12-01|3Shape Aps|Method and apparatus for three-dimensional optical scanning of interior surfaces|
GB0023681D0|2000-09-27|2000-11-08|Canon Kk|Image processing apparatus|
US6592371B2|2000-10-25|2003-07-15|Duane Durbin|Method and system for imaging and modeling a three dimensional structure|
US6645148B2|2001-03-20|2003-11-11|Vermon|Ultrasonic probe including pointing devices for remotely controlling functions of an associated imaging system|
SE523022C3|2001-03-23|2004-04-14|Cad Esthetics Ab|Procedure and apparatus for a dental restoration|
WO2002101534A1|2001-06-12|2002-12-19|Idelix Software Inc.|Graphical user interface with zoom for detail-in-context presentations|
US7907793B1|2001-05-04|2011-03-15|Legend Films Inc.|Image sequence depth enhancement system and method|
DE10125772C2|2001-05-26|2003-06-18|Duerr Dental Gmbh Co Kg|Dental or endoscopic camera|
US6904159B2|2001-12-20|2005-06-07|Mitsubishi Electric Research Laboratories, Inc.|Identifying moving objects in a video using volume growing and change detection masks|
GB0200819D0|2002-01-15|2002-03-06|Cole Polytechnique Federale De|Microscopy imaging apparatus and method for generating an image|
US7591780B2|2002-03-18|2009-09-22|Sterling Lc|Miniaturized imaging device with integrated circuit connector system|
JP4189798B2|2002-06-23|2008-12-03|明 石井|Constant magnification imaging method and apparatus using variable focus lens|
US20040155975A1|2002-09-17|2004-08-12|Hart Douglas P.|3-D imaging system|
US20070041729A1|2002-10-23|2007-02-22|Philip Heinz|Systems and methods for detecting changes in incident optical radiation at high frequencies|
US6847457B2|2002-10-29|2005-01-25|Mitutoyo Corporation|Interferometer using integrated imaging array and high-density phase-shifting array|
US7123760B2|2002-11-21|2006-10-17|General Electric Company|Method and apparatus for removing obstructing structures in CT imaging|
US20050142517A1|2003-12-30|2005-06-30|Howard Frysh|System for producing a dental implant and method|
EP1606576A4|2003-03-24|2006-11-22|D3D L P|Laser digitizer system for dental applications|
DE10321883A1|2003-05-07|2004-12-09|Universität Stuttgart|Triangulation measurement device for determining object 3D structure has illumination and observation arrays with a projected pattern being evaluated using cross correlation or phase shift analysis|
JP4216679B2|2003-09-25|2009-01-28|株式会社キーエンス|Displacement meter and displacement measurement method|
US7349104B2|2003-10-23|2008-03-25|Technest Holdings, Inc.|System and a method for three-dimensional imaging systems|
US7221332B2|2003-12-19|2007-05-22|Eastman Kodak Company|3D stereo OLED display|
US7756323B2|2004-01-15|2010-07-13|Technion Research & Development Foundation Ltd.|Three-dimensional video scanner|
WO2005071372A1|2004-01-23|2005-08-04|Olympus Corporation|Image processing system and camera|
DE102004014048B4|2004-03-19|2008-10-30|Sirona Dental Systems Gmbh|Measuring device and method according to the basic principle of confocal microscopy|
US20050212753A1|2004-03-23|2005-09-29|Marvit David L|Motion controlled remote controller|
US7292735B2|2004-04-16|2007-11-06|Microsoft Corporation|Virtual image artifact detection|
US7711179B2|2004-04-21|2010-05-04|Nextengine, Inc.|Hand held portable three dimensional scanner|
US7764589B2|2004-04-21|2010-07-27|Panasonic Corporation|Confocal optical system aperture detector that measures a light quantity balance of light received to detect a position displacement, and a confocal optical system aperture position controller, an optical head and a position detecting method performing the same|
GB0409463D0|2004-04-28|2004-06-02|Ibm|Method for removal of moving objects from a video stream|
DE602005004332T2|2004-06-17|2009-01-08|Cadent Ltd.|Method for providing data related to the oral cavity|
US20060020204A1|2004-07-01|2006-01-26|Bracco Imaging, S.P.A.|System and method for three-dimensional space management and visualization of ultrasound data |
US8060135B2|2004-07-29|2011-11-15|Sprint Spectrum L.P.|Method and system for selective application of cellular-PBX integration service|
CA2579676A1|2004-09-08|2007-02-15|Hana Golding|Compositions and methods for the detection of hiv-1/hiv-2 infection|
CN101426085B|2004-10-01|2012-10-03|小利兰·斯坦福大学托管委员会|Imaging arrangements and methods therefor|
WO2006039486A2|2004-10-01|2006-04-13|The Board Of Trustees Of The Leland Stanford Junior University|Imaging arrangements and methods therefor|
US7471450B2|2004-10-06|2008-12-30|Northeastern University|Confocal reflectance microscope system with dual rotating wedge scanner assembly|
US7683883B2|2004-11-02|2010-03-23|Pierre Touma|3D mouse and game controller based on spherical coordinates system and system for use|
US7492821B2|2005-02-08|2009-02-17|International Business Machines Corporation|System and method for selective image capture, transmission and reconstruction|
JP5154955B2|2005-03-03|2013-02-27|カデント・リミテツド|Oral scanning system and method|
US7609875B2|2005-05-27|2009-10-27|Orametrix, Inc.|Scanner system and method for mapping surface of three-dimensional object|
TWI310583B|2005-07-01|2009-06-01|Touch Micro System Tech|Method of thinning a wafer|
US7323951B2|2005-07-13|2008-01-29|John Mezzalinqua Associates, Inc.|Casing for CATV filter|
KR101170120B1|2005-07-27|2012-07-31|삼성전자주식회사|Stereoscopic display apparatus|
JP2007064802A|2005-08-31|2007-03-15|Sunx Ltd|Optical measuring instrument|
JP2007072103A|2005-09-06|2007-03-22|Fujifilm Holdings Corp|Camera|
US7605817B2|2005-11-09|2009-10-20|3M Innovative Properties Company|Determining camera motion|
KR20080090415A|2005-12-08|2008-10-08|피터 에스 러블리|Infrared dental imaging|
US8035637B2|2006-01-20|2011-10-11|3M Innovative Properties Company|Three-dimensional scan recovery|
US7912257B2|2006-01-20|2011-03-22|3M Innovative Properties Company|Real time display of acquired 3D dental data|
US20100009308A1|2006-05-05|2010-01-14|Align Technology, Inc.|Visualizing and Manipulating Digital Models for Dental Treatment|
JP2008032995A|2006-07-28|2008-02-14|Mitsutoyo Corp|Confocal microscope|
US20080118886A1|2006-11-21|2008-05-22|Rongguang Liang|Apparatus for dental oct imaging|
US8090194B2|2006-11-21|2012-01-03|Mantis Vision Ltd.|3D geometric modeling and motion capture using both single and dual imaging|
US7769230B2|2006-11-30|2010-08-03|Eastman Kodak Company|Producing low resolution images|
DE102007005726B4|2007-01-31|2010-05-12|Sirona Dental Systems Gmbh|Device and method for 3D optical measurement|
DE102007018048A1|2007-04-13|2008-10-16|Michael Schwertner|Method and arrangement for optical imaging with depth discrimination|
DE102007060263A1|2007-08-16|2009-02-26|Steinbichler Optotechnik Gmbh|Scanner for scanning e.g. teeth, in mouth of patient, has image optics arranged at distance to each other, where distance of optics and directions of optical axes are selected such that optics and axes are oriented to common area of tooth|
US9335912B2|2007-09-07|2016-05-10|Apple Inc.|GUI applications for use with 3D remote controller|
PL2051042T3|2007-10-18|2011-02-28|Nectar Imaging S R L|Device for tomographically recording objects|
TWI346309B|2007-12-21|2011-08-01|Ind Tech Res Inst|Method for reconstructing three dimension model|
US8103134B2|2008-02-20|2012-01-24|Samsung Electronics Co., Ltd.|Method and a handheld device for capturing motion|
US8121351B2|2008-03-09|2012-02-21|Microsoft International Holdings B.V.|Identification of objects in a 3D video using non/over reflective clothing|
US8129703B2|2008-03-12|2012-03-06|Optimet, Optical Metrology Ltd.|Intraoral imaging system and method based on conoscopic holography|
CN101673395B|2008-09-10|2012-09-05|华为终端有限公司|Image mosaic method and image mosaic device|
DE102008047816B4|2008-09-18|2011-08-25|Steinbichler Optotechnik GmbH, 83115|Device for determining the 3D coordinates of an object, in particular a tooth|
US8743114B2|2008-09-22|2014-06-03|Intel Corporation|Methods and systems to determine conservative view cell occlusion|
US8717416B2|2008-09-30|2014-05-06|Texas Instruments Incorporated|3D camera using flash with structured light|
CH699575A1|2008-10-06|2010-04-15|Nectar Imaging S R L|An optical system for a confocal microscope.|
EP2374023A1|2008-12-03|2011-10-12|Koninklijke Philips Electronics N.V.|Ultrasound assembly and system comprising interchangable transducers and displays|
US20100157086A1|2008-12-15|2010-06-24|Illumina, Inc|Dynamic autofocus method and system for assay imager|
EP2200332A1|2008-12-17|2010-06-23|Robert Bosch GmbH|Autostereoscopic display|
KR101199475B1|2008-12-22|2012-11-09|한국전자통신연구원|Method and apparatus for reconstruction 3 dimension model|
US8914098B2|2009-03-08|2014-12-16|Oprobe, Llc|Medical and veterinary imaging and diagnostic procedures utilizing optical probe systems|
DE102009025815A1|2009-05-15|2010-11-25|Degudent Gmbh|Measuring arrangement and method for three-dimensional measuring of an object|
US8564657B2|2009-05-29|2013-10-22|Honda Research Institute Europe Gmbh|Object motion detection system based on combining 3D warping techniques and a proper object motion detection|
CN104783757B|2009-06-17|2018-01-05|3形状股份有限公司|Focus on scanning device|
JP4758499B2|2009-07-13|2011-08-31|株式会社バンダイナムコゲームス|Image generation system and information storage medium|
US8547374B1|2009-07-24|2013-10-01|Lockheed Martin Corporation|Detection and reconstruction of 3D objects with passive imaging sensors|
US8867820B2|2009-10-07|2014-10-21|Microsoft Corporation|Systems and methods for removing a background of an image|
EP2491527B1|2009-10-22|2013-07-31|Tomtom Belgium N.V.|Method for creating a mosaic image using masks|
US9208612B2|2010-02-12|2015-12-08|The University Of North Carolina At Chapel Hill|Systems and methods that generate height map models for efficient three dimensional reconstruction from depth information|
US20110200249A1|2010-02-17|2011-08-18|Harris Corporation|Surface detection in images based on spatial data|
WO2011127375A1|2010-04-09|2011-10-13|Pochiraju Kishore V|Adaptive mechanism control and scanner positioning for improved three-dimensional laser scanning|
US8260539B2|2010-05-12|2012-09-04|GM Global Technology Operations LLC|Object and vehicle detection and tracking using 3-D laser rangefinder|
US20110310449A1|2010-06-17|2011-12-22|Eun-Soo Kim|Method for generating 3d video computer-generated hologram using look-up table and temporal redundancy and apparatus thereof|
WO2012011101A2|2010-07-19|2012-01-26|Cadent Ltd.|Methods and systems for creating and interacting with three dimensional virtual models|
US8526700B2|2010-10-06|2013-09-03|Robert E. Isaacs|Imaging system and method for surgical and interventional medical procedures|
US8849015B2|2010-10-12|2014-09-30|3D Systems, Inc.|System and apparatus for haptically enabled three-dimensional scanning|
WO2012061549A2|2010-11-03|2012-05-10|3Dmedia Corporation|Methods, systems, and computer program products for creating three-dimensional video sequences|
EP3431921B1|2010-12-06|2020-02-05|3Shape A/S|System with 3d user interface integration|
US20120179035A1|2011-01-07|2012-07-12|General Electric Company|Medical device with motion sensing|
US8401225B2|2011-01-31|2013-03-19|Microsoft Corporation|Moving object segmentation using depth images|
CN103391747B|2011-02-22|2016-02-24|3M创新有限公司|Space engraving in 3D data acquisition|
US8897526B2|2011-05-06|2014-11-25|Sirona Dental Systems Gmbh|Method, system, and computer-readable medium for uncovering and planning an accurate dental preparation|
EP3401876B1|2011-07-15|2020-02-26|3Shape A/S|Detection of a movable object when 3d scanning a rigid object|
EP2775256B1|2013-03-07|2019-07-10|a.tron3d GmbH|Method for optically detecting the three-dimensional geometry of objects|DE602005004332T2|2004-06-17|2009-01-08|Cadent Ltd.|Method for providing data related to the oral cavity|
WO2009126961A1|2008-04-11|2009-10-15|Mds Analytical TechnologiesInc.|Vibration control in scanners|
BRPI0918994A2|2008-09-22|2017-06-13|SoundBeam LLC|device, and method for transmitting an audio signal to a user.|
CH699575A1|2008-10-06|2010-04-15|Nectar Imaging S R L|An optical system for a confocal microscope.|
CN104783757B|2009-06-17|2018-01-05|3形状股份有限公司|Focus on scanning device|
CN101996021B|2009-08-12|2013-02-13|幻音科技有限公司|Handheld electronic equipment and method for controlling display contents thereby|
US8765031B2|2009-08-13|2014-07-01|Align Technology, Inc.|Method of forming a dental appliance|
DE102009038588A1|2009-08-26|2011-03-24|Degudent Gmbh|Method for determining a complete data record of an object to be measured|
WO2011120526A1|2010-03-30|2011-10-06|3Shape A/S|Scanning of cavities with restricted accessibility|
US9241774B2|2010-04-30|2016-01-26|Align Technology, Inc.|Patterned dental positioning appliance|
WO2011163359A2|2010-06-23|2011-12-29|The Trustees Of Dartmouth College|3d scanning laser systems and methods for determining surface geometry of an immersed object in a transparent cylindrical glass tank|
WO2012000542A1|2010-06-30|2012-01-05|Brainlab Ag|Medical image registration using a rigid inner body surface|
ES2788853T3|2010-12-06|2020-10-23|3Shape As|System with 3D user interface integration|
EP3431921B1|2010-12-06|2020-02-05|3Shape A/S|System with 3d user interface integration|
DK2656639T3|2010-12-20|2020-06-29|Earlens Corp|Anatomically adapted ear canal hearing aid|
WO2012083967A1|2010-12-21|2012-06-28|3Shape A/S|Optical system in 3D focus scanner|
EP2663254B1|2011-01-13|2020-07-29|Align Technology, Inc.|Methods, systems and accessories useful for procedures relating to dental implants|
TWI432009B|2011-01-14|2014-03-21|Genesys Logic Inc|Hand-held scanning system and method thereof|
US8900126B2|2011-03-23|2014-12-02|United Sciences, Llc|Optical scanning device|
GB201107225D0|2011-04-29|2011-06-15|Peira Bvba|Stereo-vision system|
EP2527784A1|2011-05-19|2012-11-28|Hexagon Technology Center GmbH|Optical measurement method and system for determining 3D coordinates of a measured object surface|
DE102011102095A1|2011-05-19|2012-11-22|Deltamed Gmbh|Multi-part molded part, particularly for prosthesis, comprises two or multiple individual components, where two individual components are arranged adjacent to each other, such that they are joined with each other by contact surface|
WO2012168322A2|2011-06-06|2012-12-13|3Shape A/S|Dual-resolution 3d scanner|
US9444981B2|2011-07-26|2016-09-13|Seikowave, Inc.|Portable structured light measurement module/apparatus with pattern shifting device incorporating a fixed-pattern optic for illuminating a subject-under-test|
GB201113071D0|2011-07-29|2011-09-14|Ffei Ltd|Method and apparatus for image scanning|
CA2788399A1|2011-08-31|2013-02-28|Woodtech Measurement Solutions|System and method for variability detection in bundled objects|
US9107613B2|2011-09-06|2015-08-18|Provel, Inc.|Handheld scanning device|
US9403238B2|2011-09-21|2016-08-02|Align Technology, Inc.|Laser cutting|
EP2587313B1|2011-10-20|2016-05-11|Samsung Electronics Co., Ltd|Optical measurement system and method for measuring critical dimension of nanostructure|
EP2750603A1|2011-10-21|2014-07-09|Koninklijke Philips N.V.|Method and apparatus for determining anatomic properties of a patient|
WO2013105922A2|2011-12-12|2013-07-18|Zygo Corporation|Non-contact surface characterization using modulated illumination|
US9742993B2|2012-02-16|2017-08-22|University Of Washington Through Its Center For Commercialization|Extended depth of focus for high-resolution optical image scanning|
US9220580B2|2012-03-01|2015-12-29|Align Technology, Inc.|Determining a dental treatment difficulty|
WO2013132091A1|2012-03-09|2013-09-12|3Shape A/S|3d scanner with steam autoclavable tip containing a heated optical element|
US9414897B2|2012-05-22|2016-08-16|Align Technology, Inc.|Adjustment of tooth position in a virtual dental model|
JP6430934B2|2012-06-27|2018-11-28|3シェイプ アー/エス|Intraoral 3D scanner to measure fluorescence|
DE102013108457A1|2012-10-05|2014-04-10|Werth Messtechnik Gmbh|Method and device for illuminating and measuring an object|
DE102012220048B4|2012-11-02|2018-09-20|Sirona Dental Systems Gmbh|Calibration device and method for calibrating a dental camera|
US8905757B2|2012-12-03|2014-12-09|E. Kats Enterprises Ltd.|Method and apparatus for measuring a location and orientation of a plurality of implants|
JP6038644B2|2012-12-27|2016-12-07|株式会社モリタ製作所|Biological imaging device|
US9652797B2|2013-01-18|2017-05-16|24/7 Customer, Inc.|Intent prediction based recommendation system using data combined from multiple channels|
FR3001564B1|2013-01-31|2016-05-27|Vit|SYSTEM FOR DETERMINING A THREE-DIMENSIONAL IMAGE OF AN ELECTRONIC CIRCUIT|
JP6849708B2|2013-02-13|2021-03-24|3シェイプ アー/エス|Focus scanning device that records colors|
CN105263437B|2013-02-13|2017-10-03|3形状股份有限公司|Record the focusing scanning means of color|
US20140272765A1|2013-03-14|2014-09-18|Ormco Corporation|Feedback control mechanism for adjustment of imaging parameters in a dental imaging system|
WO2014158150A1|2013-03-27|2014-10-02|Seikowave, Inc.|Portable structured light measurement module/apparatus with pattern shifting device incorporating a fixed-pattern optic for illuminating a subject-under-test|
US10219724B2|2013-05-02|2019-03-05|VS Medtech, Inc.|Systems and methods for measuring and characterizing interior surfaces of luminal structures|
DE102013212111A1|2013-06-25|2015-01-22|Henke-Sass, Wolf Gmbh|Endoscope and endoscopy procedure|
US9393087B2|2013-08-01|2016-07-19|Align Technology, Inc.|Methods and systems for generating color images|
US10152529B2|2013-08-23|2018-12-11|Elwha Llc|Systems and methods for generating a treatment map|
WO2015026969A1|2013-08-23|2015-02-26|Elwha Llc|Systems, methods, and devices for assessing microbiota of skin|
US9456777B2|2013-08-23|2016-10-04|Elwha Llc|Systems, methods, and devices for assessing microbiota of skin|
US10010704B2|2013-08-23|2018-07-03|Elwha Llc|Systems, methods, and devices for delivering treatment to a skin surface|
US9805171B2|2013-08-23|2017-10-31|Elwha Llc|Modifying a cosmetic product based on a microbe profile|
EP3035994B1|2013-08-23|2019-04-17|Elwha LLC|Systems, methods, and devices for delivering treatment to a skin surface|
US9811641B2|2013-08-23|2017-11-07|Elwha Llc|Modifying a cosmetic product based on a microbe profile|
US9390312B2|2013-08-23|2016-07-12|Elwha Llc|Systems, methods, and devices for assessing microbiota of skin|
US9557331B2|2013-08-23|2017-01-31|Elwha Llc|Systems, methods, and devices for assessing microbiota of skin|
DE102013218231A1|2013-09-11|2015-03-12|Sirona Dental Systems Gmbh|Optical system for generating a time-varying pattern for a confocal microscope|
WO2015039210A1|2013-09-18|2015-03-26|Matter and Form Inc.|Device, system and method for three-dimensional modeling|
KR101538760B1|2013-11-20|2015-07-24|이태경|Scanner for Oral Cavity|
US8805088B1|2013-12-16|2014-08-12|Google Inc.|Specularity determination from images|
JP2015128242A|2013-12-27|2015-07-09|ソニー株式会社|Image projection device and calibration method of the same|
WO2015118120A1|2014-02-07|2015-08-13|3Shape A/S|Detecting tooth shade|
BE1022554B1|2014-02-26|2016-06-01|Centre De Recherches Metallurgiques Asbl-Centrum Voor Research In Metallurgie Vzw|DEVICE FOR 3D MEASUREMENT OF THE TOPOGRAPH OF PRODUCTS IN PROGRESS|
US20150260509A1|2014-03-11|2015-09-17|Jonathan Kofman|Three dimensionalimaging by a mobile communication device|
CN103913118A|2014-04-10|2014-07-09|深圳先进技术研究院|Three-dimensional scanning device|
DE102014210389A1|2014-06-03|2015-12-03|Robert Bosch Gmbh|Laser scanner projector with color measurement|
US9261356B2|2014-07-03|2016-02-16|Align Technology, Inc.|Confocal surface topography measurement with fixed focal positions|
US9261358B2|2014-07-03|2016-02-16|Align Technology, Inc.|Chromatic confocal system|
US10772506B2|2014-07-07|2020-09-15|Align Technology, Inc.|Apparatus for dental confocal imaging|
EP3169396B1|2014-07-14|2021-04-21|Earlens Corporation|Sliding bias and peak limiting for optical hearing devices|
US9693839B2|2014-07-17|2017-07-04|Align Technology, Inc.|Probe head and apparatus for intraoral confocal imaging using polarization-retarding coatings|
DE102014216227B4|2014-08-14|2020-06-18|Carl Zeiss Microscopy Gmbh|Method and device for determining a distance between two optical interfaces spaced apart from one another along a first direction|
US9675430B2|2014-08-15|2017-06-13|Align Technology, Inc.|Confocal imaging apparatus with curved focal surface|
US9602811B2|2014-09-10|2017-03-21|Faro Technologies, Inc.|Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device|
DE102014013677B4|2014-09-10|2017-06-22|Faro Technologies, Inc.|Method for optically scanning and measuring an environment with a handheld scanner and subdivided display|
DE202014010357U1|2014-09-10|2015-12-11|Faro Technologies, Inc.|Device for optically scanning and measuring an environment with a hand-held scanner and control by gestures|
US9693040B2|2014-09-10|2017-06-27|Faro Technologies, Inc.|Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device|
DE102014013678B3|2014-09-10|2015-12-03|Faro Technologies, Inc.|Method for optically sensing and measuring an environment with a handheld scanner and gesture control|
EP3193699A4|2014-09-16|2018-03-07|Carestream Dental Technology Topco Limited|Dental surface imaging apparatus using laser projection|
US10449016B2|2014-09-19|2019-10-22|Align Technology, Inc.|Arch adjustment appliance|
US9610141B2|2014-09-19|2017-04-04|Align Technology, Inc.|Arch expanding appliance|
US9744001B2|2014-11-13|2017-08-29|Align Technology, Inc.|Dental appliance with cavity for an unerupted or erupting tooth|
US9924276B2|2014-11-26|2018-03-20|Earlens Corporation|Adjustable venting for hearing instruments|
IL235950D0|2014-11-27|2015-02-26|Imaging Solutions Ltd Ab|3d scanners for simultaneous acquisition of 3d data sets of 3d objects|
US10504386B2|2015-01-27|2019-12-10|Align Technology, Inc.|Training method and system for oral-cavity-imaging-and-modeling equipment|
DE102015208285A1|2015-05-05|2016-11-10|Friedrich-Schiller-Universität Jena|DEVICE AND METHOD FOR SPATIAL MEASUREMENT OF SURFACES|
DE102015209404B4|2015-05-22|2018-05-03|Sirona Dental Systems Gmbh|Method and camera for three-dimensional measurement of a dental object|
DE102015209410B4|2015-05-22|2018-05-09|Sirona Dental Systems Gmbh|Camera and method for the three-dimensional measurement of a dental object|
DE102015209402A1|2015-05-22|2016-11-24|Sirona Dental Systems Gmbh|Device for optical 3D measurement of an object|
EP3304164B1|2015-06-02|2019-06-26|Life Technologies Corporation|Systems and methods for an interchangeable illumination filter set for use in a structured illumination imaging system|
US20160366395A1|2015-06-12|2016-12-15|Microsoft Technology Licensing, Llc|LED surface emitting structured light|
US10289875B2|2015-07-31|2019-05-14|Portland State University|Embedding data on objects using surface modulation|
US10248883B2|2015-08-20|2019-04-02|Align Technology, Inc.|Photograph-based assessment of dental treatments and procedures|
US20170095202A1|2015-10-02|2017-04-06|Earlens Corporation|Drug delivery customized ear canal apparatus|
US9762712B2|2015-10-30|2017-09-12|Essential Products, Inc.|System and method for reducing the number of ports associated with a mobile device|
US9591212B1|2015-10-30|2017-03-07|Essential Products, Inc.|System and method for reducing the number of ports associated with a mobile device|
US10426351B2|2015-11-10|2019-10-01|Quantum Dental Technologies Inc.|Systems and methods for spatial positioning of diagnostic and or treatment probe based on surface profile detection|
US10220172B2|2015-11-25|2019-03-05|Resmed Limited|Methods and systems for providing interface components for respiratory therapy|
US11103330B2|2015-12-09|2021-08-31|Align Technology, Inc.|Dental attachment placement structure|
CN105547194B|2015-12-15|2018-03-27|宁波频泰光电科技有限公司|Color 3D measuring system|
CN105547195B|2015-12-15|2018-04-17|宁波频泰光电科技有限公司|Color 3D measuring system|
CN105547191B|2015-12-15|2018-03-27|宁波频泰光电科技有限公司|Color 3D measuring system|
CN105547193A|2015-12-15|2016-05-04|宁波频泰光电科技有限公司|Color 3D measuring system|
CN105571522A|2015-12-15|2016-05-11|宁波频泰光电科技有限公司|Color 3D measuring system|
CN105547192B|2015-12-15|2018-04-17|宁波频泰光电科技有限公司|Color 3D measuring system|
US10306381B2|2015-12-30|2019-05-28|Earlens Corporation|Charging protocol for rechargeable hearing systems|
CN106950687B|2016-01-06|2021-01-01|松下知识产权经营株式会社|Image generation system and image generation method|
US9858672B2|2016-01-15|2018-01-02|Oculus Vr, Llc|Depth mapping using structured light and time of flight|
US10634487B2|2016-02-01|2020-04-28|Kla-Tencor Corporation|Method and system for optical three dimensional topography measurement|
KR20170093445A|2016-02-05|2017-08-16|주식회사바텍|Dental three-dimensional scanner using color pattern|
US10966803B2|2016-05-31|2021-04-06|Carestream Dental Technology Topco Limited|Intraoral 3D scanner with fluid segmentation|
EP3471599A4|2016-06-17|2020-01-08|Align Technology, Inc.|Intraoral appliances with sensing|
WO2017218951A1|2016-06-17|2017-12-21|Align Technology, Inc.|Orthodontic appliance performance monitor|
BR112018076410A2|2016-06-24|2019-04-09|3Shape A/S|3d scanner using a structured probe light beam|
CN106019550B|2016-07-12|2019-05-24|上海交通大学|Dynamic focusing device and focus tracking method for high-speed microscopic scanning|
WO2018011331A1|2016-07-13|2018-01-18|Naked Labs Austria Gmbh|Motor driven turntable with foldable sensor mast|
WO2018012862A1|2016-07-13|2018-01-18|문정본|Three-dimensional scanner and apparatus for processing artificial object using same|
US10507087B2|2016-07-27|2019-12-17|Align Technology, Inc.|Methods and apparatuses for forming a three-dimensional volumetric model of a subject's teeth|
EP3578131B1|2016-07-27|2020-12-09|Align Technology, Inc.|Intraoral scanner with dental diagnostics capabilities|
US10084979B2|2016-07-29|2018-09-25|International Business Machines Corporation|Camera apparatus and system, method and recording medium for indicating camera field of view|
US20180077504A1|2016-09-09|2018-03-15|Earlens Corporation|Contact hearing systems, apparatus and methods|
WO2018057711A1|2016-09-21|2018-03-29|Johnson Philip M|Non-contact coordinate measuring machine using hybrid cyclic binary code structured light|
NL2017513B1|2016-09-22|2018-03-29|Ccm Beheer Bv|Scanning system for creating 3D model|
JP6768442B2|2016-10-12|2020-10-14|株式会社キーエンス|Shape measuring device|
CN113648088A|2016-11-04|2021-11-16|阿莱恩技术有限公司|Method and apparatus for dental images|
WO2018093733A1|2016-11-15|2018-05-24|Earlens Corporation|Improved impression procedure|
US11026831B2|2016-12-02|2021-06-08|Align Technology, Inc.|Dental appliance features for speech enhancement|
CN110062609B|2016-12-02|2021-07-06|阿莱恩技术有限公司|Method and apparatus for customizing a rapid palate expander using a digital model|
US10548700B2|2016-12-16|2020-02-04|Align Technology, Inc.|Dental appliance etch template|
EP3562379A1|2016-12-30|2019-11-06|Barco NV|System and method for camera calibration|
US10456043B2|2017-01-12|2019-10-29|Align Technology, Inc.|Compact confocal dental scanning apparatus|
US10779718B2|2017-02-13|2020-09-22|Align Technology, Inc.|Cheek retractor and mobile device holder|
CN107101982A|2017-03-09|2017-08-29|深圳先进技术研究院|Fluorescence microscopy device|
JP6786424B2|2017-03-13|2020-11-18|株式会社モリタ製作所|3D scanner|
US10463243B2|2017-03-16|2019-11-05|Carestream Dental Technology Topco Limited|Structured light generation for intraoral 3D camera using 1D MEMS scanning|
JP6766000B2|2017-03-17|2020-10-07|株式会社モリタ製作所|3D scanner|
US10613515B2|2017-03-31|2020-04-07|Align Technology, Inc.|Orthodontic appliances including at least partially un-erupted teeth and method of forming them|
DE102017003231A1|2017-04-03|2018-10-04|Mühlbauer Gmbh & Co. Kg|Optical component detection system and method for detecting at least one component|
US10600203B2|2017-06-06|2020-03-24|CapSen Robotics, Inc.|Three-dimensional scanner with detector pose identification|
US11045283B2|2017-06-09|2021-06-29|Align Technology, Inc.|Palatal expander with skeletal anchorage devices|
US10708574B2|2017-06-15|2020-07-07|Align Technology, Inc.|Three dimensional imaging apparatus with color sensor|
US10639134B2|2017-06-26|2020-05-05|Align Technology, Inc.|Biosensor performance indicator for intraoral appliances|
US10885521B2|2017-07-17|2021-01-05|Align Technology, Inc.|Method and apparatuses for interactive ordering of dental aligners|
EP3658067A1|2017-07-27|2020-06-03|Align Technology, Inc.|System and methods for processing an orthodontic aligner by means of an optical coherence tomography|
US11116605B2|2017-08-15|2021-09-14|Align Technology, Inc.|Buccal corridor assessment and computation|
WO2019036677A1|2017-08-17|2019-02-21|Align Technology, Inc.|Dental appliance compliance monitoring|
EP3451023A1|2017-09-01|2019-03-06|Koninklijke Philips N.V.|Time-of-flight depth camera with low resolution pixel imaging|
EP3625605A4|2017-09-29|2021-03-24|Leica Biosystems Imaging, Inc.|Two-dimensional and three-dimensional fixed z scanning|
US10813720B2|2017-10-05|2020-10-27|Align Technology, Inc.|Interproximal reduction templates|
EP3692329A4|2017-10-06|2021-06-23|Aaron Bernstein|Generation of one or more edges of luminosity to form three-dimensional models of objects|
US11096763B2|2017-11-01|2021-08-24|Align Technology, Inc.|Automatic treatment planning|
KR20190051696A|2017-11-07|2019-05-15|삼성전자주식회사|Meta projector and electronic apparatus including the same|
CN111417357A|2017-11-30|2020-07-14|阿莱恩技术有限公司|Sensor for monitoring oral appliance|
US10980613B2|2017-12-29|2021-04-20|Align Technology, Inc.|Augmented reality enhancements for dental practitioners|
US11013581B2|2018-01-26|2021-05-25|Align Technology, Inc.|Diagnostic intraoral methods and apparatuses|
WO2019147936A1|2018-01-26|2019-08-01|Vanderbilt University|Systems and methods for non-destructive evaluation of optical material properties and surfaces|
KR20200123160A|2018-02-16|2020-10-28|쓰리세이프 에이/에스|Intraoral scanning with surface differentiation|
DE102018105132A1|2018-03-06|2019-09-12|Kappa optronics GmbH|Triangulation device|
WO2019199680A1|2018-04-09|2019-10-17|Earlens Corporation|Dynamic filter|
PL425395A1|2018-04-30|2019-11-04|Milton Essex Spolka Akcyjna|Apparatus for multimodal analysis of allergic reaction in the course of dermal tests and the hybrid method of multispectral imaging of allergic reactions in the course of dermal tests and its application for automatic evaluation of test results|
US10753734B2|2018-06-08|2020-08-25|Dentsply Sirona Inc.|Device, method and system for generating dynamic projection patterns in a confocal camera|
US10819706B2|2018-07-09|2020-10-27|Igt|System, apparatus and method for facilitating remote gaming communications in a venue|
JP6883559B2|2018-08-31|2021-06-09|株式会社モリタ製作所|Medical equipment|
JP6968046B2|2018-08-31|2021-11-17|株式会社モリタ製作所|3D measurement system, 3D measurement device, and control program|
CN109470146B|2018-12-07|2020-06-09|哈尔滨工业大学|High-resolution stereo vision system and measuring method|
CN109596063B|2018-12-07|2020-07-28|哈尔滨工业大学|Multi-wavelength high-resolution stereo vision measuring device and method|
CN109470144B|2018-12-07|2020-11-20|哈尔滨工业大学|Line scanning high-resolution stereo vision measuring system and method|
CN109470147B|2018-12-07|2020-05-22|哈尔滨工业大学|Self-adaptive high-resolution stereo vision system and measuring method|
CN109470143B|2018-12-07|2020-07-28|哈尔滨工业大学|External light source high-resolution stereo vision measuring system and method|
CN109470145A|2018-12-07|2019-03-15|哈尔滨工业大学|Polarization modulation high-resolution stereo vision measurement system and method|
CN109579700B|2018-12-07|2020-07-28|哈尔滨工业大学|Disc scanning high-resolution stereo vision measuring system and method|
SG11202105401TA|2018-12-27|2021-07-29|Johnson & Johnson Consumer Inc|Device and method for selective application of topical composition using dynamic threshold values|
CA3121196A1|2018-12-27|2020-07-02|Johnson & Johnson Consumer Inc.|Device and method for application of topical compositions guided by projected fiducials|
CN109668869A|2018-12-28|2019-04-23|中国科学院长春光学精密机械与物理研究所|Hand-held reflective confocal laser scanning microscopy detection device|
CN109739016A|2019-01-16|2019-05-10|中国科学院苏州生物医学工程技术研究所|Rapid three-dimensional imaging system based on structured illumination microscopy and synchronization control method|
EP3701908A1|2019-02-28|2020-09-02|Sirona Dental Systems GmbH|3D intraoral scanner|
JP6777784B2|2019-03-04|2020-10-28|Ckd株式会社|Inspection device, blister packaging machine, and blister pack manufacturing method|
KR102229270B1|2019-03-06|2021-03-18|주식회사 디디에스|Method and 3D oral scanner for forming structured light for a subject using a complementary color pattern|
CN110095088B|2019-05-14|2020-11-20|哈尔滨理工大学|Method and device for detecting surface topography characteristics of curved surface splicing area based on grating identification|
GB2589071A|2019-11-01|2021-05-26|King's College London|Dental imaging|
WO2021239583A2|2020-05-26|2021-12-02|Dentsply Sirona Inc.|Method and apparatus for multimodal soft tissue diagnostics|
JP2020151497A|2020-06-01|2020-09-24|株式会社モリタ製作所|Three-dimensional scanner|
KR20220016643A|2020-08-03|2022-02-10|오스템임플란트 주식회사|3-dimensional intraoral scanner|
Legal status:
2018-05-15| B11A| Dismissal according to Art. 33 of the IPL - examination not requested within 36 months of filing|
2018-06-05| B04C| Request for examination: application reinstated [chapter 4.3 patent gazette]|
2019-01-15| B06F| Objections, documents and/or translations needed after an examination request [chapter 6.6 patent gazette]|
2019-07-16| B06T| Formal requirements before examination [chapter 6.20 patent gazette]|
2019-11-19| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2020-01-21| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 17/06/2010, SUBJECT TO THE LEGAL CONDITIONS.|
Priority:
Application number | Filing date | Patent title
US18774409P|2009-06-17|2009-06-17|
US61/187,744|2009-06-17|
US23111809P|2009-08-04|2009-08-04|
US61/231,118|2009-08-04|
PCT/DK2010/050148|WO2010145669A1|2009-06-17|2010-06-17|Focus scanning apparatus|